Baroness Kidron debates involving the Department for Digital, Culture, Media & Sport

Covid-19: Music Sector and Creative Economy

Baroness Kidron Excerpts
Thursday 23rd April 2020

Lords Chamber
Baroness Barran

I thank my noble friend and agree on the critical part that this sector has played in the growth of the economy and the creation of high-quality jobs. As I said in response to an earlier question, our advice will be based on the science and the five tests that were highlighted earlier this week. I cannot add to that at this stage.

Baroness Kidron (CB)

My Lords, the power of the creative industries is often expressed in their ability to generate £100 billion for the economy, but perhaps more important is their contribution to the national psyche. Since lockdown, one of the few bright spots in this difficult time has been to witness the irrepressible creativity and joy that results from the rise of book and film clubs, galleries and theatre online, ballet from the kitchen and rock legends performing in their bedrooms. Does the Minister recognise the need for a specific financial support package that does not look only to immediate needs but rather recognises the atypical workforce and the length of time it takes to get an idea from page to public?

The Lord Speaker (Lord Fowler)

Congratulations, Lady Kidron, on getting through all the electronic feedback while you were putting your question.

Cairncross Review

Baroness Kidron Excerpts
Thursday 6th February 2020

Lords Chamber
Asked by
Baroness Kidron

To ask Her Majesty’s Government what steps they are taking in response to The Cairncross Review: a sustainable future for journalism.

Lord Bethell (Con)

My Lords, I gently remind the House of the three-minute time limit. This is a time-limited debate, and it would be helpful if Members could please stick to that limit.

Baroness Kidron (CB)

My Lords, it has been a year since Dame Frances Cairncross published her review, A Sustainable Future for Journalism. Cairncross’s remit was

“to consider the sustainability of the production and distribution of high-quality journalism, and especially the future of the press”.

The review’s six chapters outline: the importance of high-quality journalism to democracy; the rapidly changing market; the plummeting revenues of publishers; the huge power of the online platforms; and the need to protect public interest news. Sadly, the Government’s response does not comprehensively answer Dame Frances’s nine recommendations, nor does it fully address the two intrinsically linked systemic points that she highlights—notably, the impact of platforms as mediators on the quality of the news and the asymmetry of power between platform and publishers when it comes to revenue.

I declare my interests as set out in the register, particularly as a member of the House of Lords’ digital democracy inquiry committee and as chair of the 5Rights Foundation.

The most urgent issue raised repeatedly by Cairncross is how new distribution models for high-quality journalism have eroded revenue. This is a sector being hollowed out before our eyes, with reduced resources to hold institutions to account, as the platform model drives down quality in pursuit of profit. In her introduction, Cairncross points out:

“People read more sources of news online, but spend less time reading it than they did in print. They increasingly skim, scroll or passively absorb news, much of it ‘pushed’ news”,

which is

“based on data analytics and algorithms, the operation of which are often opaque.”

Platforms such as Facebook, Twitter, Google and YouTube measure views, likes and retweets, not the quality of the news they share. Under the guise of being “user first”, they are focused on building algorithms to increase engagement and, with it, their revenues—not on people’s understanding of what is happening in the world around them.

A user journey with a diet of financial, entertainment, political and international news, as readers made their way from front page to sports page, has been replaced by unbundled news: bite-sized snacks driven by an opaque list of inputs that optimise user engagement; it is often difficult for readers to know or recall the source. Disaggregated news driven by commercial concerns necessarily interferes with a user journey based on editorial or public interest values. This business model enables disinformation to masquerade as news. It is not without consequences: the victims are children who get measles, pensioners who give up their savings and individuals who vote on false promises.

Cairncross recommended:

“New codes of conduct to rebalance the relationship between publishers and online platforms”,


underpinned by a news quality obligation under regulatory oversight. While the government response has warm words about these codes, it is unclear whether they are to be put on a statutory footing; it is silent on who will have oversight and offers no timetable. The news quality obligation becomes a vague sense that platforms must

“help users identify the reliability and trustworthiness of news sources”,

with allusions to the online harms White Paper. I do not understand why the Government commissioned a review on such an urgent matter, only for us to wait a year to hear that we will wait several more. Can the Minister outline the steps the Government will take to introduce new, effective codes of conduct and when we will begin to see them enforced? Also, what obstacles does she see to introducing a news quality obligation in response to the review, rather than waiting for an online harms Bill whose effect may not be felt for another couple of years?

As classified and display ads have moved wholesale from publishers to platforms, particularly Google, where targeted advertising is king, the duopoly of Google and Facebook has become eye-wateringly rich and the news sector increasingly poor. Meanwhile, news producers remain at the mercy of news feed algorithms that can, at the whim of a platform, be changed for no transparent reason, giving platforms the power to literally bury the news. Cairncross’s observation that the opaque advertising supply chain is weighted against content creators is not new. It was central to the Communications Committee’s report, UK Advertising in a Digital Age; it has been the subject of much complaint by advertisers themselves; and it is well laid out in the interim review from the CMA.

This dysfunctional business model hits the local press the hardest. The Yorkshire Evening Post showed the societal value of local reporters when it broke the story of a child being treated on an NHS hospital floor. The subsequent false discrediting of the story on social media showed the financial value in misinformation. The editor’s plea to the digital democracy committee was that the Post needed a fairer share of the value of the content it produces. Without it, it simply cannot continue to put reporters on the front line.

Cairncross recommends an innovation fund, a VAT exemption to match offline publishing and charitable status for local papers. The first of these is being done by NESTA, the second is being looked at by the Treasury, and the last the Government rejected outright, but at the heart of her recommendations was that the CMA should use its powers to investigate the advertising supply chain to ensure that the market be fair and transparent. Given the unanimity of this view, and the disproportionate control of the platforms, will the Minister tell the House whether she would like to see—as many of us would—the CMA move to a full market investigation to clean up the advertising supply chain?

Cairncross urged the extension of the Local Democracy Reporting Service, but this has been interpreted by the Government as an extension of the BBC local news partnerships, with no additional funding. This is not an adequate response to the crisis in local journalism, nor does it fulfil the Government’s own promise to advocate for voters outside the metropole, whose local interests may be too small to be of financial value in the attention economy of the multinationals. Leaving whole parts of the country out of sight is not sustainable for our democracy.

The review also called for an Ofcom inquiry into the impact of BBC News on the commercial sector. However, I would argue that of greater concern are the recent announcements of large-scale cuts to BBC News. Amid the crisis in the local press, it is simply not the right time to undermine the BBC. In an era of catastrophically low trust, BBC News is uniquely trusted by 79% of the population—a statistic that any platform or politician would beg for.

Finally, the commitment from the Government to support media literacy is hugely welcome. The ability to identify the trustworthiness of a source and to understand platforms’ algorithms, how they shape what you see and who benefits from your interactions, is vital. But I urge the noble Baroness to make clear in her answer that media literacy is no substitute for cleaning up the hostile environment in which the news now sits.

I asked Frances Cairncross to comment on the government response to her review. She said it was

“of particular regret that the government rejected out of hand the idea of an Institute of public interest journalism.”

On another occasion, one might underline further the responsibility of the press to uphold their own editorial standards and better fulfil their own public interest role but, for today, I wish to congratulate Dame Frances on categorically making the case for high-quality journalism as a crucial safeguard of democracy.

I look forward to hearing from many knowledgeable colleagues and thank them in advance for their contributions. Since The Cairncross Review was published, the news sector has become more fragile, while the platforms’ power has become entrenched. I hope that the Minister—delightfully making her maiden speech in this debate—finds a way of reassuring the House that the Government intend to tackle the systemic issues that Cairncross has identified with the seriousness and urgency they require. I beg to move.

Digital Inclusion

Baroness Kidron Excerpts
Thursday 23rd January 2020

Lords Chamber
Baroness Morgan of Cotes

I thank my noble friend; he is absolutely right. My department has launched a digital inclusion innovation fund, designed to tackle digital exclusion among older and disabled people, and I have just talked about the qualifications. What he also hinted at is that, for many people, it is a case of simply finding it difficult to go online or to complete government forms. We want to make sure that there is support available; for example, in our network of around 3,000 libraries, in accessible locations, there are trained staff and volunteers and assisted access to a wide range of digital public services.

Baroness Kidron (CB)

My Lords, I add my welcome to those of others to the Secretary of State and refer the House to my interests in the register. Does she agree that inclusion is about more than getting the greatest number of people online as quickly as possible, and depends on the digital environment being designed in a way that respects the needs and rights of users, be they women in public life, vulnerable users, or children and young people? In particular, can she take the opportunity of welcoming the age-appropriate design code, published by the ICO yesterday, and tell the House when she expects to lay it before Parliament?

Baroness Morgan of Cotes

I thank the noble Baroness. She and I had a brief conversation recently about some of these issues, and I look forward to discussing this further with her. She is absolutely right to say that the digital and tech environment is very exciting, but that it of course brings new challenges, not just about the new technology itself but about behaviours online. That is why the Government will legislate following the online harms White Paper and will develop further legislation. I welcome the publication yesterday by the Information Commissioner’s Office of the age-appropriate design code, and I hope that all parliamentarians will have the opportunity to take note of it.

Social Media

Baroness Kidron Excerpts
Thursday 11th July 2019

Lords Chamber
Baroness Kidron (CB)

I thank the right reverend Prelate for tabling today’s debate and draw the attention of the House to my interests as set out in the register. I very much welcome the Church of England’s social media guidelines. They have great force in their simplicity and generosity of spirit, and clearly outline our responsibilities to conduct our online interactions respectfully and honestly. I will focus my contribution on how they might be applied to the social media companies themselves.

For example, the first guideline is:

“Be safe. The safety of children, young people and vulnerable adults must be maintained”.


Far from taking reasonable steps to maintain the safety of children or to support their emotional and social development, social media companies refuse even to recognise the global consensus that a child is a person under the age of 18, as codified by the Convention on the Rights of the Child. Tick a box and a child of 13 can gain access to an environment that routinely exposes them to adult risks and deprives them of the rights that we have fought for decades to establish. Furthermore, minimum age limits are routinely bypassed and poorly enforced, a fact freely admitted by both Snap and Facebook when they appeared before Parliament in recent months. This leaves children of all ages unprotected through many of their most vulnerable years. For children to be safe online, social media companies first have to provide a safe environment.

A similar scenario unfolds when you consider the guideline:

“Be honest. Don’t mislead people about who you are”.


The spread of misinformation and disinformation polarises debate, impacts on elections, drives the rise in intolerance and fuels spurious health claims and conspiracy theories. This is an area of considerable attention for legislators around the globe but, while much is said about those who create the misinformation, it is important to note that the platforms are not neutral bystanders. In an attention economy where clicks mean money, and where the longer someone stays online the more you maximise your opportunity to serve them an ad or learn something about them that you can sell later, the spread of the extraordinary, the extreme or the loud is not an unintended consequence of your service; it becomes central to its purpose.

Being honest is not only about information but about the nature of the service itself. When we walk into a tea room, a cinema, a pub or a strip club, we understand the opportunities and risks that those environments offer and are given nuanced indicators about their suitability for ourselves or our children. Social media companies, by contrast, parade as tea rooms but behave like strip clubs. A simple answer would be greater honesty about the nature of the service.

This leads me quite neatly to the guidance to,

“Follow the rules. Abide by the terms and conditions”.


Terms and conditions should enable users to decide whether a service is offering them an environment that will treat them fairly. They are, by any measure, a contract between user and platform; it is therefore unacceptable that these published rules are so opaque, so asymmetrical in the distribution of rights and responsibilities, so interminably long—and then so inconsistently and poorly upheld by the platforms themselves.

This failure to follow the rules is not without consequence. Noble Lords will remember the case of Molly Russell, who took her own life in 2017 after viewing and being auto-recommended graphic self-harm and suicide content. The spokesperson for one of the platforms responsible, Pinterest, said:

“Our existing self-harm policy already does not allow for anything that promotes self-harm. However, we know a policy isn’t enough. What we do is more important than what we say”.


Indeed, and while that tragedy has been widely and bravely publicised by Molly’s father, it is neither the only tragedy nor the only failure. Failure is built into the system. The responsibility for upholding terms and conditions must be a two-way street. I warmly welcome the Government’s proposal in the online harms White Paper:

“The regulator will assess how effectively these terms are enforced as part of any regulatory action”,


and I welcome the Information Commissioner’s similar commitment in the recently published age-appropriate design code.

Let me finish with this. On Monday, 22 children came to the House to see me and offer their thoughts on a 5Rights data literacy workshop that they had been doing for some months. Their observations can be usefully summed up by the fifth of the Church’s guidelines:

“Take responsibility. You are accountable for the things you do”.


These children and young people categorically understood their responsibilities, but they powerfully and explicitly expressed the requirement for the platforms to meet theirs too. It is for the platforms to make their services safe and respectful, for government to put in place the unavoidable requirement that they do so, and for the rest of us to keep speaking up until it is done. With that in mind, I commend the right reverend Prelate for his tireless work to that end and ask the Minister to reassure the House that the promises made to children and parents by the outgoing Executive will be implemented by the incoming Executive.

Regulating in a Digital World (Communications Committee Report)

Baroness Kidron Excerpts
Wednesday 12th June 2019

Lords Chamber
Baroness Kidron (CB)

My Lords, it is always a pleasure to follow the noble Baroness, Lady Harding, who, not for the first time, has beautifully articulated some of my points. But I intend to repeat them, and I hope that they will emerge not as stolen thunder but as a common cause, and perhaps a storm around the House as others speak also.

Since my time on the committee shortly comes to an end, I take this opportunity to record my personal thanks to the noble Lord, Lord Gilbert, for his excellent chairmanship throughout, and to pay tribute to my colleagues, who make our meetings so fantastically interesting, collaborative and, occasionally, robust. I also thank the clerk, Theo Pembroke, who has always met our insatiable curiosity with extraordinary patience and good humour. I draw the attention of the House to my interests as set out in the register, particularly as chair of the 5Rights Foundation.

In its introduction, Regulating in a Digital World offers the following observation:

“The need for regulation goes beyond online harms. The digital world has become dominated by a small number of very large companies. These companies enjoy a substantial advantage, operating with an unprecedented knowledge of users and other businesses”.


Having heard from scores of witnesses and read a mountain of written evidence, the committee concludes that regulatory intervention is required to tackle this “power imbalance” between those who use technology and those who own it. As witness after witness pointed out,

“regulation of the digital world has not kept pace with its role in our lives”;

the tech sector’s response to “growing public concern” has been “piecemeal”; and effective, comprehensive and future-proof regulation is urgent and long overdue. It is on this point of how the sector has responded to these calls for regulation that I will address the bulk of my remarks today.

Earlier this year, Mark Zuckerberg said:

“I believe we need a more active role for government and regulators. By updating the rules for the internet, we can preserve what’s best about it ... while also protecting society from broader harms”.


Meanwhile, Jeff Bezos said that Amazon will,

“work with any set of regulations we are given. Ultimately, society decides that, and we will follow those rules, regardless of the impact that they have on our business”.

These are just two of several tech leaders who have publicly accepted the inevitability of a regulated online world, which should, in theory, make the implementation of regulation passed in this House a collaborative affair. However, no sooner is regulation drafted than the warm words of sector leaders are quickly replaced by concerted efforts to dilute, delay and disrupt. Rather than letting society decide, the tech sector is putting its considerable resource and creativity into preventing society, and society’s representatives, applying its democratically agreed rules.

The committee’s proposal for a digital authority would provide independence from the conflicts built into the DNA of DCMS, whose remit to innovate and grow the sector necessarily demands a hand-in-glove relationship but which also has a mandate to speak up for the rights and protections of users. More broadly, such an authority would militate against the conflicts between several government departments, which, in speaking variously and vigorously on digital matters across security, education, health and business, are ultimately divided in their purpose. In this divide and rule, the industry position that can be summed up as, “Yes, the status quo needs to change but it shouldn’t happen now or to me, and it mustn’t cost a penny” remains unassailable.

The noble Lord, Lord Gilbert, set out many of the 10 principles by which to shape regulation into an agreed and enforceable set of societal expectations, but they are worth repeating: parity on- and offline, accountability, transparency, openness, privacy, ethical design, recognition of childhood, respect for human rights and equality, education and awareness-raising, and democratic accountability. I want to pick up on one single aspect of design because, if we lived in a world in which the 10 principles were routinely applied, maybe I would not have been profoundly disturbed by an article by Max Fisher and Amanda Taub in the New York Times last week, which reported on a new study by researchers from Harvard’s Berkman Klein Center. The researchers found that perfectly innocent videos of children, often simply playing around outside, were receiving hundreds of thousands of views. Why? Because YouTube algorithms were auto-recommending the videos to viewers who had just watched “prepubescent, partially clothed children”. The American news network MSNBC put it a little more bluntly:

“YouTube algorithm recommends videos of kids to paedophiles”.


However, although YouTube’s product director for trust and safety, Jennifer O’Connor, is quoted as saying that,

“protecting kids is at the top of our list”,

YouTube has so far declined to make the one change that researchers say would prevent this happening again: to identify videos of prepubescent children—which it can do automatically—and turn off its auto-recommendation system on those videos.

The article goes on to describe what it calls the “rabbit hole effect”, which makes the viewing of one thing result in the recommendation of something more extreme. In this case, the researchers noticed that viewing sexual content led to the recommendation of videos of ever younger women, then young adults in school uniforms and gradually to toddlers in swimming costumes or doing the splits. The reason for not turning off auto-recommend for videos featuring prepubescent children is—again, I quote the YouTube representative’s answer to the New York Times—that

“recommendations are the biggest traffic driver; removing them would hurt ‘creators’ who rely on those clicks”.

This is what self-regulation looks like.

Auto-recommend is also at the heart of provision 11 in the ICO’s recently published Age Appropriate Design Code, which, as the right reverend Prelate said, is commonly known as the “kids’ code”. Conceived in this House and supported by many noble Lords who are in the Chamber tonight, provision 11 prevents a company using a child’s data to recommend material or behaviours detrimental to children. In reality, this provision, and the kids’ code in general, does no more than what Mark Zuckerberg and Jeff Bezos have agreed is necessary and publicly promised to adhere to. It puts societal rules—in this case, the established rights of children, including their right to privacy and protection—above the commercial interests of the sector and into enforceable regulation.

Sadly, and yet unsurprisingly, the trade association of the global internet companies here in the UK, the Internet Association, which represents, among others, Amazon, Facebook, Google, Twitter and Snapchat, is furiously lobbying to delay, dilute and disrupt the code’s introduction. The kids’ code offers a world in which the committee’s principle—the recognition of childhood—is fundamental; a principle that, when enacted, would require online services likely to be accessed by children to introduce safeguards for all users under the age of 18.

The Internet Association cynically argues that the kids’ code should be restricted to services that are “targeted at children”, in effect putting CBeebies and “Sesame Street” in scope, while YouTube, Instagram, Facebook, Snapchat, et cetera, would be free to continue to serve millions of children as they alone deem fit. The Internet Association has also demanded that children be defined only as those under 13, so that anyone over 13 is effectively treated like an adult. This is out of step with the Data Protection Act 2018 that we passed in this House with government agreement, which defines a child as a person under 18. Moreover, in the event that it is successful in derailing the code in this way, it would leave huge numbers of children unprotected during some of the most vulnerable years of their life.

Perhaps the most disingenuous pushback of all is the Internet Association’s claim that complying with regulations is not technically feasible. This is a sector that promises eye-watering innovation and technical prowess, that intends to get us to the moon on holiday and fill our streets with driverless cars. In my extensive conversations with engineers and computer scientists both in and out of the sector, no one has ever suggested that the kids’ code presents an insurmountable technical problem, a fact underlined by conversations I had in Silicon Valley only a few weeks ago. Yes, it requires a culture change and it may have a price, but the digital sector must accept, like all other industries have before it, that promoting children’s welfare—indeed, citizens’ and community welfare more generally—is simply a price of doing business. Let us not make the mistake of muddling up price and cost, since the cost of not regulating the digital world is one that our children are already paying.

Regulating in a Digital World establishes beyond doubt that if we want a better digital world, we must act now to shape it according to societal values, one of which is to recognise the vulnerabilities and privileges of childhood. I recognise and very much welcome the future plans of the Government in this area, but if we cannot get one exemplar code effectively and robustly into the real world, what message does that send to the sector about our seriousness in fulfilling the grand ambitions of the online harms White Paper?

When replying, could the Minister give some reassurance that the Government will indeed stand four-square behind the Information Commissioner and her ground-breaking kids’ code? In doing so, will they meet the expectations of parents, who have been promised a great deal by this Government but have not yet seen the change in the lived experience of their children? More importantly still, will they meet the needs and uphold the rights of UK children, rather than once again giving in to tech sector lobbying?

I will finish with the words of a 12 year-old boy who I met last Thursday in a 5Rights workshop. A self-professed lover of technology, he said, “They sacrifice people for cash. It makes me so angry. I can’t believe that people are so unnecessarily greedy”. His words, remarkable from someone so young, eloquently sum up the committee’s report.

Online Harms

Baroness Kidron Excerpts
Monday 8th April 2019

Lords Chamber
Lord Ashton of Hyde

My Lords, with regard to disinformation connected with democracy and those essential questions, the White Paper deals with disinformation generally. With regard to electoral reform and how elections can be affected by the use of the internet, as I said, the Cabinet Office is bringing out a report soon to deal with that. It is right that constitutional affairs are dealt with there.

On disinformation, we have listed in the White Paper some of the areas we expect the regulator to include, such as:

“Promoting diverse news content … Improving the transparency of political advertising”—


noble Lords can read it themselves; there are other things. That is how we are trying to do it across government. As I said, there are other areas that we deliberately do not cover in the White Paper, but that should not be taken to mean that work is not going on. However, I accept the noble Lord’s suggestion that it is important and needs to be done soon. I take that on board.

As far as time is concerned, we are having a consultation, as the noble Lord said, which will end on 1 July. Obviously, it is not possible for me to say today when legislation will come before the House. That is a decision for the Government and the Leaders of both Houses. Judging by the discussions we have had today, and the feeling I get from across the House, all noble Lords think that this is an important issue. The Government think that this is an important issue. We are aware that we have taken time over the consultation. As far as the Home Office and DCMS are concerned, we want to get on with it.

We have just announced a review of advertising that will report in due course.

Baroness Kidron (CB)

My Lords, I too welcome the White Paper. I thank the Minister and the Secretary of State for being open to discussions during the process, and for indicating that there will be more discussions. I feel that more discussions are required because it is a little lacking in detail, and I share others’ concerns about the definition of harms. I was particularly upset not to see a little more work done on the everyday harms: the gaming, the gambling and the addictive loops that drive such unhealthy behaviours online. There are a lot of questions in the paper and I look forward to us all getting together to answer them—I hope quickly and soon. I really welcome the Minister’s words about the anxiety of the Government and both Houses to bring a Bill forward, because that is the litmus test of this White Paper: how quickly we get something on the books.

I feel encouraged by the noble Lord, Lord Griffiths, to mention that on Monday next week we have the launch of the final stage of the age-appropriate design code, which takes a safety-by-design approach. That is what I most welcome in the White Paper, in the Government’s attitude and in the work that we have in front of us: what we want to do is drive good behaviour. We want to drive corporate responsibility. We want to drive shareholders to take responsibility for those massive profits and to make sure that we do not allow the tech sector its exceptionality. It is a business like any other and it must do no harm. In relation to that I mention Will Perrin and Lorna Woods, who brought it forth and did so much work.

Finally, I am really grateful for what the Minister said about the international community. It is worth saying that these problems are in all parts of the world—we are not alone—and they wait and look at what we are doing. I congratulate the Government on acting first.

Lord Ashton of Hyde

Obviously, there are details that need to be ironed out, and that is partly what the consultation is about. I expect there to be a lot of detail, which we will go over when a Bill finally comes to this House. In the past we have dealt with things like the Data Protection Act and have shown that we can do that well. The list in the White Paper of legal harms and everyday harms, as the noble Baroness calls them, is indicative. I completely agree with her that the White Paper is attempting to drive good behaviour. The difference it will make is that companies cannot now say, “It’s not my problem”. If we incorporate this safety by design, they will have to do that, because they will have a duty of care right from the word go. They cannot say, “It’s not my responsibility”, because we have given them the responsibility, and if they do not exercise it there will be serious consequences.

Children and Young People: Digital Technology

Baroness Kidron Excerpts
Thursday 17th January 2019

Lords Chamber
Moved by
Baroness Kidron

That this House takes note of the relationship between the use of digital technology and the health and well-being of children and young people.

Baroness Kidron (CB)

My Lords, I am very grateful to all noble Lords who have chosen to speak this afternoon, and very much look forward to each of their contributions. I refer the House to my interests on the register, particularly that as founder and chair of 5Rights.

Fundamental to this debate is the fact that we invented a technology that assumes that all users are equal when, in fact, a third of users worldwide and a fifth of users in the UK are children. It has been 150 years since we pulled children out of the chimneys and put them into school. Since that time we have fought on their behalf for privileges, protections and inalienable rights that collectively constitute the concept of, and offer a legal framework for, childhood.

Childhood is the journey from infancy to maturity, from dependence to autonomy. We design and mitigate for it in multiple ways across all aspects of society. We educate; we require doctors to obtain additional skills to practise paediatric medicine; we do not hold children to contractual obligations; we put pedestrian crossings near schools; we rate films according to age. Children have special protections around sexual activity. It is illegal for kids to smoke, drink and gamble. We even take steps to protect them in environments where adults smoke, drink and gamble.

In short, we provide a complex but widely understood and respected set of social norms, educational frameworks, regulatory interventions and national and international laws reflecting the global consensus that society as a whole must act in the best interests of the child, in the light of the vulnerabilities and immaturities associated with their age. The digital environment fails to reflect that consensus, and the cost of that failure is played out on the health and well-being of our children.

In setting out this afternoon’s debate, I shall concentrate on three areas: the nature of the digital environment, my concern about the way we conceive online harms and, finally, how we might support children to flourish. For children in the connected world, there is no off or on. Their lives are mediated by technological devices and services that capture infinitesimal detail about their activities, frame the choices available to them and make assumptions—not always accurate—about who they are. Theirs is not a world divided by real and virtual; it is a single lived experience augmented by technology. The vast majority of a child’s interactions are not deliberate decisions of a conscious mind but are predetermined. A child may consciously choose to play a game, but it is machine-engineered Pavlovian reward loops embedded in the game that keep them playing. A child may consciously opt to participate in a social group, but it is the stream of personalised alerts and the engineered measures of popularity that create the compulsive need to attend to that social group. A child may wish to look up a piece of information, but it is the nudge of promoted content and automated recommendation that largely determines what information they receive.

Those predetermined systems are predicated on a business model that profiles users for commercial purposes, yet businesses that sell devices and services in the digital environment deliver them to children with impunity—even though we know that screens eradicate the boredom and capacity for free play that very young children require to develop language, motor skills and imagination; even though we know that a single tired child, kept awake through the night by the hooks and notifications of a sector competing for their attention, affects the educational attainment of the entire class; and even though we know that for teenagers, the feedback loops of social validation and competition intrinsic to social media play an overwhelming role in their state of mind and ability to make safe choices.

The children we work with at 5Rights make the case that it is simply not possible to act your age online. As one young boy said, “Online, I am not a kid but an underage adult”. His Royal Highness the Duke of Cambridge said about the tech sector:

“Their self-image is so grounded in their positive power for good that they seem unable to engage in constructive discussion about the social problems that they are creating”,


including,

“fake news, extremism, polarisation, hate speech, trolling, mental health, privacy and bullying”.

Last year, I was in Africa when a young girl was auctioned as a bride on Facebook. I have sat with the parents of a child bullied to death online. I have been with a young girl at the devastating moment in which she realised that she had been taping sexual acts for a group, not just for the man with whom she thought she was in a relationship. I have been witness to scores of children who have ruined their family life, educational opportunities, reputation and self-esteem through overuse, misuse, misunderstandings and straightforward commercial abuse. An individual child does not, and should not be expected to, have the maturity to meet the social, sexual, political and commercial currency of the adult world.

In December, the Nurture Network, a multidisciplinary group of academics, mental health workers and child development experts, agreed that the three existing agencies of socialisation—family, friends and school—have now been joined by a fourth: the digital environment, an environment of socialisation in which the status of children is not recognised. In an interconnected world, the erosion of the privileges, protections and rights of childhood in one environment results in an erosion of childhood itself.

That brings me to my concerns about how we conceive harms. I will briefly raise three issues. First, our public discourse focuses on a narrow set of extreme harms of a violent or sexual nature. Ignoring so-called “lesser harms” misunderstands that for a child, harms are often cumulative. It fails to deal with the fact that one child will react violently to an interaction that does not harm another, or that vulnerable groups of children might merit specific and particular protection. Crucially, it ignores the fact that for most children, it is the quotidian that lowers their self-esteem, creates anxiety, and inflicts an opportunity cost in which education, relationships and physical and personal development are denuded, rendering children—or, should I say, “underage adults”?—exposed and unprotected. Children’s rights are deliberately conceived as non-hierarchical. We must take all harms seriously.

Secondly, it is not adequate to define children’s experience of the digital environment in terms of an absence of harm. As long ago as 1946, the World Health Organization declared that well-being was,

“not merely the absence of disease or infirmity”.

The NHS defines it as a feeling of “physical, emotional and psychological” well-being. We must set our sights not on the absence of harm but on a child’s right to well-being and human flourishing.

Thirdly, whether we are tackling the problems of live streaming, child sexual abuse, gaming addiction or thinking towards a new world order in which the fridge knows more about your child’s dietary tastes than you do and can exploit that fact, we must not wait until harm has been done but consider in advance the risks that children face. Technology changes fast, but the risks consistently fall into four categories: content risks, both unsuitable and illegal; contact risks, often, but not always, involving an adult; conduct risks, involving risky behaviour or social humiliation; and contract risks, such as exploitative contractual relationships, gambling, aggressive marketing, unfair terms and conditions, discriminatory profiling and so on. Most experts, including many in the enforcement community, consider that upstream prevention based on militating against risk rather than waiting for the manifestation of harm is by far the most effective approach.

There is much we can do. The Minister knows that I am not short of suggestions, but I will finish with a modest list. The digital environment is now indivisible from other environments in which our legal and regulatory arrangements embody our values. Parity of protection has been called for by the NSPCC. It was the approach taken in the Law Commission’s Abusive and Offensive Online Communications: A Scoping Report, and was articulated by the noble Lord, Lord Stevenson, in establishing that the Health and Safety at Work Act 1974 applies equally to artificial intelligence. What plans do the Government have to bring clarity to how our laws apply to the digital environment? Specifically, will the Government bring forward a harmonisation Bill to create an obligation to interpret legislation in a manner that offers parity of protection and redress online and offline, in a similar manner to Section 3 of the Human Rights Act?

Designing out known risk, often referred to as safety by design, is standard across other sectors. We like our brakes to work, our food to be free of poisons and our contracts to be fair in law. The Secretary of State has said that he is minded to introduce a duty of care on the sector. That is very welcome—but to be effective, it must be accompanied by impact assessments, design standards, transparency reporting, robust oversight and a regulator with the full toolkit of persuasion and penalty. Can the Minister confirm that the Government are planning this full suite of provisions?

The age-appropriate design code introduced by this House demands that companies anticipate the presence of children and meet their development needs in the area of data protection. I hope that the Minister will confirm the Government’s determination to produce a robust code across all areas of design agreed during the passage of the Data Protection Act. The code’s safety by design approach could and should be an exemplar of the codes and standards that must eventually form part of an online safety Bill.

Finally, companies make many promises in their published guidelines that set age limits, content rules and standards of behaviour, but then they do not uphold them. It is ludicrous that 61% of 12 year-olds have a social media account in spite of a joining age of 13, that Facebook says that it cannot work to its own definition of hate speech or that Twitter can have half a million pornographic images posted on it daily and still be characterised as a news app. Subjecting routine failure to uphold published terms to regulatory penalty would prevent companies entering into commercial contracts with underage children, drive services to categorise themselves accurately and ensure that companies say what they do, do what they said and are held to account if they fail to do it. I would be grateful if the Minister could confirm that this measure will be included in the upcoming White Paper.

Technology is often said to be neutral, and when we criticise the sector we are told that we are endangering its promise to cure cancer, educate the world and have us experience space travel without leaving our home, or threatening the future prosperity of the nation. Technology is indeed neutral, but we must ask to what end it is being deployed. It could in the future fulfil the hope of its founders and offer the beneficial outcomes for society that we all long for—but not if the price is the privileges, protections and inalienable rights of childhood. A child is a child until they reach maturity, not until the moment they reach for their smartphone.

--- Later in debate ---
Baroness Kidron

My Lords, this has turned into something of a “Today” programme moment, where, having been asked the question, you have no time at all to answer. I am very sorry about that but I thank everybody for their contributions. It has been a hugely interesting debate and very diverse. The one thing that I would like to say in concluding—

Social Media Services

Baroness Kidron Excerpts
Monday 12th November 2018

Lords Chamber
Baroness Kidron (CB)

I thank the noble Lord, Lord Stevenson of Balmacara, for introducing this timely debate and illustrating why it is so important. I also thank him for his kind words. I refer the House to my broad interests in this area.

The statutory duty of care as set out by Will Perrin and Professor Lorna Woods is an important and very welcome prospect. A duty of care is proportionate. The higher the risk, the greater the responsibility of the company to consider its impact in advance. A duty of care is a concept that users themselves can understand. It offers an element of future-proofing, since companies would have to evaluate the risk of a service or product failing to meet the standard of “reasonably foreseeable harm”. It would also ensure that powerful global companies that hide behind the status of being “mere conduits” are held responsible for the safety of the online services they provide. However, a duty of care works only if it applies to all digital services, all harms and all users.

The risks of drawing too narrowly the parameters with which services must comply are highlighted by the provisions of the Digital Economy Act 2017, which sought to restrict children’s access to pornography based on scale and yet failed to bring platforms such as Twitter within scope, despite 500,000 pornographic images being posted daily. Equally, if the duty of care applies to some harms and not others, the opportunity to develop a systemic approach will be missed. Many headlines are preoccupied with the harms associated with content or contact, but there is a host of others. For example, behavioural design—otherwise known as “nudge and sludge”—is a central component of many of the services we use. The nudge pushes us to act in the interests of the online service, while the sludge features are those deliberately designed to undermine or obfuscate our ability to act in our own best interests. It is designed to be addictive and involves the deliberate manipulation of free will.

It is also necessary to consider how a duty of care characterises whom we are protecting. We know that children often experience specific harms online differently from adult users. Some categories of people whom we would not consider vulnerable in other settings become targets online—for example, female MPs or journalists. Some harms are prejudicial to whole groups. Examples are the racial bias found in algorithms used to determine bail conditions and sentencing terms in the US, or the evidence that just a handful of sleep-deprived children in a classroom diminishes the academic achievement of the entire class. Of course, there are harms to society as a whole, such as the undeclared political profiling that influences electoral outcomes.

I understand that the proposal for a duty of care policy is still under consideration, but I would be grateful if the Minister would outline the Government’s current thinking about scope, including the type and size of services, what harms the Government seek to address and whether they will be restricted to harms against individuals.

When setting out their safety strategy in 2017, the Government made a commitment that what is unacceptable offline should be unacceptable online. That is an excellent place to start, not least because the distinction between online and offline increasingly does not apply. The harms we face are cross-cutting and only by seeing them as an integrated part of our new augmented reality can we begin to consider how to address them.

But defence against harm is not the only driver; we should hope that the technology we use is designed to fulfil our rights, to enable our development and to reflect the values embodied in our laws and international agreements. With that in mind, I propose four pillars of safety that might usefully be incorporated into a broader strategy: parity, safety by design, accountability and enforcement. Parity online and offline could be supported by the publication of guidance to provide clarity about how existing protections apply to the digital environment. The noble Lord, Lord Stevenson, mentioned the Health and Safety at Work Act, and the Law Commission recently published a scoping report on abusive and offensive online communications.

Alongside such sector-by-sector analysis, the Government might also consider an overarching harmonisation Bill. Such a Bill would operate in a similar way to Section 3 of the Human Rights Act by creating an obligation to interpret legislation in a way that creates parity of protection and redress online and offline to the extent that it is possible to do so.

This approach applies also to international agreements. At the 5Rights Foundation we are supporting the United Nations Committee on the Rights of the Child in writing a general comment that will formally outline the relevance of the 40-plus articles of the charter to the digital environment. Clarifying, harmonising, consolidating and enhancing existing agreements, laws and regulations would underpin the parity principle and deliver offline norms and expectations in online settings. Will the Minister say whether the Government are considering this approach?

The second pillar is the widely supported principle of safety and privacy by design. In its March 2018 report Secure by Design, the DCMS concluded that government and industry action was “urgently” required to ensure that internet-connected devices have,

“strong security … built in by design”.

Minimum universal standards are also a demand of the Department for Business, Energy and Industrial Strategy and the consumer organisation Which?. They are also a central concern of the Child Dignity Alliance technical working group to prevent the spread of images of child sexual abuse. It will publish its report and make recommendations on Friday.

We should also look upstream at the design of smart devices and operating systems. For example, if Google and Apple were to engineer safety and privacy by design into the Android and iOS operating systems, it would be transformative.

There is also the age-appropriate design code that many of us put our names to. The Government’s response to the safety strategy acknowledges the code, but it is not clear that they have recognised its potential to address a considerable number of interrelated harms, nor its value as a precedent for safety by design that could be applied more widely. At the time, the Minister undertook that the Secretary of State would work closely in consultation with the Information Commissioner and me to ensure that the code is robust and practical, and meets the development needs of children. I ask the Minister to restate that commitment this evening.

The third pillar is accountability—saying what you will do, doing what you said and demonstrating that you have done it. Accountability must be an obligation, not a tool of lobbyists to account only for what they wish us to know. The argument made by services that they cannot publish data about complaints, or offer a breakdown of data by age, harm and outcome because of commercial sensitivities, remains preposterous. Research access to commercial data should be mandated so that we can have independent benchmarking against which to measure progress, and transparency reporting must be comprehensive, standardised and subject to regulatory scrutiny.

This brings me to enforcement. What is illegal should be clearly defined, not by private companies but by Parliament. Failure to comply must have legal consequences. What is contractually promised must be upheld. Among the most powerful ways to change the culture of the online world would be the introduction of a regulatory backstop for community standards, terms and conditions, age restrictions and privacy notices. This would allow companies the freedom to set their own rules, and routine failure by a company to adhere to its own published rules would be subject to enforcement notices and penalties.

Where users have existing vulnerabilities, a higher bar of safety by default must be the norm. Most importantly, the nuanced approaches that we have developed offline to live together must apply online. Any safety strategy worth its title must not balk at the complexity but must cover all harms from the extreme to the quotidian.

While it is inappropriate for me to leap ahead of the findings of the House of Lords committee inquiry on who should be the regulator, it is clear that this is a sector that requires oversight and that all in the enforcement chain need resources and training.

I appreciate the Government’s desire to be confident that their response is evidence-based, but this is a fast-moving world. A regulator needs to be independent of industry and government, with significant powers and resources. The priorities of the regulator may change but the pillars—parity, safety by design, accountability and enforcement—could remain constant.

The inventor of the web, Sir Tim Berners-Lee, recently said that,

“the web is functioning in a dystopian way. We have online abuse, prejudice, bias, polarisation, fake news, there are lots of ways in which it is broken”.

It is time to fix what is broken. A duty of care as part of that fix is warmly welcome, but I hope that the Minister will offer us a sneak preview of a much bolder vision of what we might expect from the Government’s White Paper when it comes.

Data Protection Bill [HL]

Baroness Kidron Excerpts
Monday 14th May 2018

Lords Chamber
Lord Ashton of Hyde

My Lords, the main amendments in this group relate to the representation of data subjects by not-for-profit bodies. Last time we discussed this matter, the question before us was whether those bodies should have to seek the mandate—that is, the consent—of data subjects before pursuing claims on their behalf.

As I said then,

“the Government have reflected on the principles at stake here and agree it would be reasonable for a review to be undertaken, two years after Royal Assent, of the effectiveness of”—

Clause 183—

“as it is currently drafted. The Government are fully prepared to look again at the issue”,

of representation without prior mandate in the context of that review.

“We are serious about this. We will therefore amend the Bill in the other place to provide for such a review and to provide the power for the Government to implement its conclusions”.—[Official Report, 10/1/18; col. 287.]

Commons Amendments 122 and 123 duly deliver on that promise, while Commons Amendment 121 allows the Secretary of State to make regulations to ensure that, where a not-for-profit seeks to represent a large number of data subjects in court proceedings, it can file one claim and not hundreds.

I am grateful to the noble Baroness, Lady Kidron, for her continued engagement on this subject. She and I are in total agreement that children merit specific protection in relation to their personal data, and that the review should accordingly look at the specific barriers young people face in exercising their rights. Therefore, Commons Amendment 122 makes provision for that in subsections (4), (5) and (6) of the proposed new clause. Of course, as some noble Lords have mentioned previously, such provision is not to the exclusion of other vulnerable groups in our society, and the Government fully expect that review to consider their position, too.

Commons Amendment 126 would allow Her Majesty’s Revenue & Customs to share contact details with the Ministry of Defence so that it is better able to locate and contact members of the ex-regular reserve. The amendment does not alter the liability of ex-regular reserves, nor does it affect the rules regarding their call-out or recall; it is simply about being better able to contact them. The security of the United Kingdom is the primary responsibility of government. Commons Amendment 126 offers us the opportunity to strengthen that security.

Finally, Commons Amendment 282 would insert a schedule making transitional, transitory and saving provision in connection with the coming into force of the Bill, including provision about subject access requests, the Information Commissioner’s enforcement powers and national security certificates. This comprehensive new schedule, running to some 19 pages, is designed to ensure a seamless shift between the 1998 Act and the new data protection law we are scrutinising today. I beg to move.

Baroness Kidron (CB)

I thank the Government for listening, and I thank the Bill team, the Secretary of State and the Minister, Margot James. The point is that rights are only as good as one’s ability to enact them, so I really welcome the review, and I thank all concerned for the very great care and detail with which they have laid it out in the Bill.

Lord Clement-Jones

My Lords, very briefly, we had considerable debate, while the Bill was going through this House, on whether we should incorporate Article 80(2), and we obviously did not prevail. Although this does not go as far as incorporating Article 80(2), which I regret—I would clearly like to see the whole loaf, so to speak—at least it gives the possibility of Article 80(2) being incorporated through a review. Will the Minister say when he thinks the review will be laid, in the form of a report? I am assuming that,

“within 30 months of commencement of the Bill”,

means within 30 months from 25 May this year. I am making that assumption so that we can all count the days to when the report will come back for debate in Parliament.

Data Protection Bill [HL]

Baroness Kidron Excerpts
3rd reading (Hansard): House of Lords & Report: 2nd sitting (Hansard): House of Lords
Wednesday 17th January 2018

Lords Chamber
Amendment Paper: HL Bill 77-I Marshalled list for Third Reading (16 Jan 2018)
Baroness Howe of Idlicote (CB)

My Lords, I am pleased to speak to my Amendment 4, which I regard as small but important for the purposes of clarification.

Last month, there was universal support from your Lordships when my noble friend Lady Kidron introduced her excellent amendment on the age-appropriate design code, which is now the subject of Clause 124. At the time, I raised a question about the intention regarding the scope of the amendment, as there is no definition of “children” either in the amendment or in the Bill. I said that, as the amendment refers to the United Nations Convention on the Rights of the Child,

“I assume that the intention is that the age-appropriate design code of practice will cover all children up to the age of 18”.—[Official Report, 11/12/17; col. 1430.]

During the debate, my noble friend Lady Kidron said:

“The code created by the amendment will apply to all services,

‘likely to be accessed by children’,

irrespective of age and of whether consent has been asked for. This particular aspect of the amendment could not have been achieved without the help of the Government. In my view it is to their great credit that they agreed to extend age-appropriate standards to all children”.—[Official Report, 11/12/17; col. 1427.]

I was reassured by this statement about the intent of the clause, but I remain concerned that there is no explicit definition in the Bill to indicate that we are indeed talking about any person under the age of 18, especially as the reference to the UN Convention on the Rights of the Child in Clause 124(4) imposes an obligation only to “have regard to” it.

The truth is that there is no clear or consistent reference to a child or children in the Data Protection Bill. Clause 9 defines the right of a child to consent to their data’s use and says that this right starts at 13. Clause 201 covers children in Scotland, suggesting that there the right commences at the age of 12. These different approaches open the door to arguments about the age at which the rights conferred by Clause 124 are operational for children. I would hate us to find ourselves in a position where, once this Bill was passed, a debate began about the ages at which the benefits of Clause 124 applied to children. Because the Bill is not clear, this could result in a narrowing of the definition so that Clause 124 benefited only some people under 18, rather than all those under 18.

Years of experience have taught me that it is best to be crystal clear about what we are talking about, and that is why I have tabled this amendment. If the Government do not think it necessary, I hope the Minister will clearly state in his reply that the Government intend that Clause 124 should indeed relate to all persons under the age of 18. I look forward to hearing what he has to say. I beg to move.

Baroness Kidron (CB)

My Lords, I thank my noble friend for bringing this issue to the attention of the House. It is my understanding that, by invoking the UNCRC, we are talking about children being people under the age of 18. I would very much welcome the Minister’s saying that that extends beyond Clause 124, which we brought forward, to everywhere in the Bill that “children” is mentioned.

Lord Swinfen (Con)

My Lords, can the Minister tell the House at what age the United Nations considers that a child ceases to be a child?