Artificial Intelligence (Select Committee Report)

Monday 19th November 2018

Lords Chamber
Baroness Kidron (CB)

My Lords, I warmly congratulate the noble Lord, Lord Clement-Jones, and the committee on both the tone of the report and the excellent set of recommendations. While leaving the broader questions to members of the committee, I will offer four brief thoughts that might find their place in the wider discussion today. In doing so, I refer the House to my interests in the register.

The first is about the ownership of vast datasets which, as the report says, are “fuelling the current AI boom”.

While we hold some rights over the initial transfer of our data, the more processes it is subjected to, the less control over, or even knowledge of, its impact we have. On a recent trip to Silicon Valley, an engineer put it to me this way: “You may be the author of your data, but we own all that it implies about you and all it implies about others”. The data, the inferences and the knowledge they offer are closely guarded by a handful of Silicon Valley behemoths because they are so very valuable. They allow those companies to determine the choices and opportunities we are given or denied as individuals and communities, and as a society more broadly.

In the changing landscape of movie production, user behaviour is increasingly known in detail: the exact moment a viewer stopped watching, their age, socioeconomic group and familial relationships and, in some instances, even their shopping habits, last known holiday or current mood. This data is used to make production decisions. My colleagues in the film business are increasingly anxious that the elements of production over which they have agency are diminishing, including the very stories that can be made.

This may be an area in which we do not traditionally worry about the control of AI over decision-making, but the stories we tell are an expression of our culture, our time and even occasionally our resistance. If the stories we are able to tell are determined by machine-learned patterns that reinforce what has gone before, is not the end game the miserable prospect that the story of our time will be a reflection of the lowest common denominator of what the AI says we like?

This example of the commercial control of data may be very specific, but I could easily have talked about Google’s monopoly over search ranking, the gatekeeping role that Apple and Android play in relation to their app stores, or Facebook’s ability to make or break the spread of political advertising. Perhaps the Minister will therefore say whether he believes that laws governing competition and market dominance are fit for a business model in which data is the currency.

My second point is that behind this wall of data control sits information that it is in the public interest for us to have. For example, it is an ongoing struggle to get tech companies to share data about the reports and complaints they receive from children, particularly those children who do not meet the age restrictions of the services they are using.

The Government’s Internet Safety Strategy has proposed a draft transparency report and, in doing so, prompted both Google and Facebook into some pre-emptive reporting. But neither the government proposal nor the reports already proffered give the sort of access to data needed to understand how children of different ages react to harms; which drivers of online experience create harm; which have the biggest impact on children’s mental health and well-being; whether platforms are effectively triaging complaints; and what solutions, both AI and human, are most effective in reducing the impacts of different sorts of harm. In short, access to far wider data is essential to gather the insights that would promote better outcomes or defences from harm. The ability to tackle harms at scale is hampered by this lack of access to commercial datasets. So I ask the Minister whether it is time to mandate licensed research access to privately held datasets where there is an overwhelming case of public interest.

That brings me to the question of considering children more broadly in the design of services. In my work I speak to many engineers who design and build AI, almost all of whom admit that, until they met me, they had never considered the needs of children, or the consequences for them, of the services they design. Many challenges faced by users online are commercially driven, intentional design choices. Such choices require universal standards and regulatory intervention, but others are due to a level of societal blindness on the part of those who create the systems. So, in addition to my strong support for all the recommendations relating to the education of children in schools, I impress upon the Minister the urgent need for professional qualifications in computer science and associated disciplines to include mandatory modules covering rights by design (including safety by design, privacy by design and ethics by design), impact assessments and precautionary principles in the design of all AI for all users, but particularly children. Engineers are the drivers of tech culture, and an intervention in their training is a cheap and impactful way of tackling those aspects of AI design that are unconscious and unintended.

Finally, the committee’s report concludes that introducing AI-specific regulation would be less effective than ensuring that existing sector-specific regulation applies to AI decisions. I welcome this approach, but we need greater clarity on how existing protections apply to the digital environment. Just as the noble Baroness, Lady Buscombe, confirmed to the noble Lord, Lord Stevenson, that the Health and Safety at Work Act 1974 applies to AI, will the Minister confirm that the Equality Act 2010 and the Consumer Rights Act 2015 similarly apply? In a recent debate I floated the idea of an overarching harmonisation Bill that would operate in a similar way to Section 3 of the Human Rights Act 1998 by creating an obligation to interpret legislation in a way that creates parity of protection and redress online and offline, to the extent that it is possible to do so. I did not get an answer in that debate, and I wonder whether I might be luckier today.

These are just four aspects of a broader need to hold those who design and control the digital environment to the same standards to which we hold the institutions and commercial players in other environments. This is not so much a brave new world as one that has yet to catch up and be subject to the principles inherent in parity, accountability, design standards and enforcement. Each of these offers an approach to the digital environment that would ensure that it meets the rights and needs of its users. I hope that the Minister will feel able to fully answer the points that I have raised. I welcome this excellent report.