All Lord Knight of Weymouth contributions to the Data Protection Act 2018

Data Protection Bill [HL]

(Committee: 2nd sitting (Hansard): House of Lords)
Lord Knight of Weymouth Excerpts
Monday 6th November 2017

Lords Chamber

Department for Digital, Culture, Media and Sport
Lord Storey (LD)

My Lords, as I have said on a number of occasions, for 40 years my previous job was as a teacher, 20 of those as a head teacher. One of my prime responsibilities as a head teacher was the safeguarding of children in my school. That was the most important thing I did: to make sure they were safe, so that those primary-age children, aged from five to 11, and nursery children as well, could enjoy their childhood and their parents could know that they were safe and enjoying their innocence.

The Government did a lot with their education policies about safeguarding. Anyone visiting the school had to be checked and double-checked and had to wear identification. Children who went out of school had to be escorted properly and correctly. As part of our personal and social health education, we made sure that young people themselves understood. Yet, when it comes to this area, we seem not to take the role as seriously as we should. I was reading the newspapers on the train from Liverpool this morning. I just could not believe the Times headline:

“Children as young as ten are sexting”.

The article says that,

“according to figures from the National Police Chiefs’ Council. In 2015-16, there were 4,681 cases”,

where children as young as 10 were either sending inappropriate messages or photographs to other pupils or receiving them. Imagine it was your daughter who at the age of seven or eight—and some of them are that young—was receiving inappropriate pictures from other pupils. How would you feel as a parent? Is that really protecting or safeguarding those children?

I do not want to speak at length in this debate; I think the noble Baronesses, Lady Kidron and Lady Harding, have said it all. It is not beyond our wit to do these simple things. I have seen for myself that self-regulation does not work. I hope that between now and Report the Government will put aside any feeling that, “We can’t do this because of the EU, because of our own lethargy, because of what we have said in the past or because it will create more regulation”. This is about children. Let us all agree that on Report we can agree these eminently sensible amendments.

Lord Knight of Weymouth (Lab)

My Lords, I support the amendments. I remind the House of my interests in relation to my work at TES, the digital education company.

The noble Baroness, Lady Kidron, and the others who have supported the amendment have given the Government a pretty neat way out of the problem that 13 as the age of consent for young people to sign up to “information society services”, as the Bill likes to call them, feels wrong. I have found that for many Members of your Lordships’ House, 16 feels like a safer and more appropriate age, for all the reasons that the noble Lord, Lord Storey, has just given in terms of defining when children are children. There is considerable discomfort about 13 in terms of where the Bill currently sits.

However, I think many noble Lords are realists and understand that to some extent the horse has bolted. Given the huge numbers of young people currently signing up to these services who are under 13, trying to pretend that we can find a way of forcing the age up to 16 from the accepted behavioural norm of 13 looks challenging. Yet we want to protect children. So the question is whether these amendments would provide that solution. That hinges on whether it is reasonable to ask the suppliers of information society services to verify age, and whether it is then reasonable to ask them to design in an age-appropriate fashion. From my experience, the answer to both is yes, it is. Currently, all you do is tick a box to self-verify that you are the age you are. If subsequently you want to have your data deleted, you may have to go through a whole rigmarole to prove that you are who you are and the age you say you are, but for some reason the service providers do not require the same standard of proof and efficacy at the point where you sign up to them. That is out of balance, and it is effectively our role to put it back into balance.

The Government themselves, through the Government Digital Service, have an exceedingly good age-verification service called, strangely, Verify. It does what it says on the tin, and it does it really well. I pay tribute to the GDS for Verify as a service that it allows third parties to use: it is not used solely by Government.

So age verification is undoubtedly available. Next, is it possible—this was explored in previous comments, so I will not go on about it—for age-appropriate design to be delivered? From our work at TES, I am familiar with how you personalise newsfeeds based on data, understanding and profiling of users. It is worth saying, incidentally, that those information society service providers will be able to work out what age their users are from the data that they start to share: they will be able to infer age extremely accurately. So there is no excuse for not knowing how old their users are. Any of us who use any social media services will know that the feeds we get are personalised, because they know who we are and they know enough about us. It is equally possible, alongside the content that is fed, to shift some aspects of design. It would be possible to filter content according to what is appropriate, or to give a slightly different homepage, landing page and subsequent pages, according to age appropriateness.

I put it to the Minister, who I know listens carefully, that this is an elegant solution to his problem, and I hope that he reflects, talks to his colleague the right honourable Matthew Hancock, who is also a reasonable Minister, and comes back with something very similar to the amendments on Report, assuming that they are not pressed at this stage.

Baroness Hollins (CB)

My noble friend made a very strong case. The internet was designed for adults, but I think I am right in saying that 25% of time spent online is spent by children. A child is a child, whether online or offline, and we cannot treat a 13-year-old as an adult. It is quite straightforward: the internet needs to be designed for safety. That means it must be age appropriate, and the technology companies need to do something about it. I support the amendments very strongly.

--- Later in debate ---
The Parliamentary Under-Secretary of State, Department for Digital, Culture, Media and Sport (Lord Ashton of Hyde) (Con)

My Lords, the noble Lord, Lord Stevenson, said that he hoped I had a sense of where the Committee is coming from. I very much have a sense of that. I know that child online safety is an issue that is taken seriously by all noble Lords in the House, and it has been the subject of much debate apart from today. I am therefore grateful to the noble Baroness and to all who contributed for introducing this important subject. I assure all noble Lords that we have an open mind. However, I will pour a bit of cold water because some issues, to which we may well come back, need to be thought about. I apologise to the noble Baroness, Lady Kidron, for the fact that we have not met. I thought that we were arranging a meeting. I have certainly talked to my noble friend Lady Harding about these amendments. However, I repeat not only to her but to every noble Lord that I am very happy to talk to anyone about these matters before Report, and I have no doubt that I will be talking to the noble Baroness before too long.

At Second Reading we heard a good deal about the need to improve online safety and concerns about the role that social media companies play in young people’s lives. The Government are fully committed to this cause. Our approach has been laid out in the Internet Safety Strategy Green Paper, published earlier this month. In that strategy, the Government detailed a number of commitments to improve online safety for all users and issued a consultation on further work, including the social media code of practice, the social media levy and transparency reporting. Although the Government are currently promoting a voluntary approach to work with industry, we have clearly stated in the strategy—and I repeat it now—that legislation will be introduced if necessary, and this will be taken forward in the digital charter.

The Government’s clear intention is to educate all users on the safe use of online sites such as social media sites. Again, this is set out in the strategy. This includes efforts targeted at children, comprising working with civil society groups to support peer-to-peer programmes and revised national curriculums. We believe that education is fundamental to safe use of the internet because it enables users to build the skills and resilience needed to navigate the online world and to be capable of adapting to the continuous changes and innovations that we see in this space.

The aim of these amendments is to allow information society services to make use of the derogation in the GDPR to set the age threshold at 13 only if sites comply with guidance on the minimum standards of age-appropriate design as set out by the Information Commissioner. Although the Government are sympathetic to their goal to raise the level of safety online, we have some questions about how it would work in practice and some fundamental concerns about its possible unintended consequences.

The noble Lord, Lord Storey, said that we should not rest our case on EU law. That is an enticing argument, especially from a Liberal Democrat, but I think that there is a sense of frustration there and I would not hold him to that. However, the fact is that, as we discussed last week, we are determined to ensure that we preserve the free flow of data once the UK leaves the EU.

I have to raise the issue of compliance with the GDPR, because we have a very real concern that these amendments are not compatible with it. The GDPR was designed as a regulation to ensure harmonisation of data protection laws across the EU. The nature of the internet and the transnational flow of data that it entails mean that effective regulations need international agreement. However, these amendments would create additional burdens for data controllers. Article 8 of the GDPR says that member states may provide by law for a lower age but it does not indicate that exercising this derogation should be conditional on other requirements. These amendments go further than permitted, creating a risk for our future trading relationships.

The noble Baroness mentioned that she had advice from a prominent QC. If she would care to share that with us, I would be happy to discuss it with her, and we will put that in front of our lawyers as well. I have an open mind on this but we think that there is an issue as far as the GDPR’s compatibility is concerned.

Amendment 155 would require the Information Commissioner to produce guidance on standards and design. The Information Commissioner will already be providing guidance on minimum standards to comply with the requirement not to offer services to under-13s without parental consent. Indeed, it will be the role of the commissioner to enforce the new law on consent. Although the guidance will not include details on age-appropriate design, this is not something that should be overlooked by government. However, tackling the problem of age-appropriate design is not just a data protection issue, and we should be very cautious about using this age threshold as a tool to keep children off certain sites. This is about their data and not the more fundamental question of the age at which children should be able to use these sites.

We need to educate children and work with internet companies to keep them safe and allow them to benefit from being online. Where there is clearly harmful material, such as online pornography, we have acted to protect children through a requirement for age verification in the Digital Economy Act 2017. The Government’s Internet Safety Strategy addresses a wide range of ways to protect the public online. While online safety, particularly for children, is very important, we should not be confusing this with the age at which parental consent is no longer required for the processing of personal data by online services. The Government have a clear plan of action.

Lord Knight of Weymouth

I apologise to the Minister for interrupting. I am just interested in that confusion that he talks about. Perhaps I am incorrect, but I understand that images, for example, are data. There is a lot of concern about sexting and about platforms such as Snapchat and the sharing of data. Where is the confusion? Is it in the Government, or in the Chamber?

Lord Ashton of Hyde

I do not think I mentioned confusion. What we are talking about in the Bill is purely data protection. We are talking about the age at which children can consent to information society services handling their data. What I think the noble Baroness, and a lot of Peers in the House, are talking about is keeping children safe online, which is more than just protection of their personal data.

--- Later in debate ---
Lord Ashton of Hyde

I am happy to confirm those two points. On extraterritoriality, I agree with the noble Baroness that it is difficult to control. Commercial sites are easier—an example of which is gambling. We can control the payments, so if sites are commercial and cannot pay people, they may well lose their attractiveness. Of course, the only way to solve this is through international agreement, and the Government are working on that. Part of my point is that there is a risk in driving children away to sites located abroad. The big, well-known sites are by and large responsible. They may not do what we want, but they will work with the Government. That is the thrust of our argument. We are working with the well-known companies and, by and large, they act responsibly, even if they do not do exactly what we want. As I say, however, we are working on that. The noble Baroness is right to say that, if we drive children on to less responsible sites based in jurisdictions with less sensible and acceptable regimes, that is a problem.

Lord Knight of Weymouth

Could the Minister help me with any information he might have about when the GDPR was drawn up? It must have been envisaged when Article 8 was put together that some member states would go with something different—be it 13, 16, or whatever. The issue of foreign powers must have been thought about, as well as verifying age, parental consent, or the verification of parental identity to verify age. Article 8 just talks about having to have parental sign-off. These issues of verification and going off to foreign powers must have been thought about when the article was being put together in Europe. Does he have any advice on what they thought would be done about this problem?

Lord Ashton of Hyde

I cannot give the noble Lord chapter and verse on what the European bureaucrats were thinking when they produced the article, but age verification is not really the issue here, because it is extremely difficult to verify ages below 18 anyway. Although one can get a driving licence at 17, it is only at the age of 18 that you can have a credit card. As I say, the issue here is not age verification—rather, it is that, when we make things too onerous, that has the potential to drive people away on to other sites which take their responsibilities less seriously. That was the point I was trying to make.

--- Later in debate ---

We are therefore at an important time. By agreeing this amendment, we can ensure that PSHE will be the vehicle by which these issues can be taught to all children in all schools. I hope that when we come to Report the Minister will be able to report that that will be the case. I beg to move.

Lord Knight of Weymouth

My Lords, does the Minister agree with the noble Lord, Lord Storey, that PSHE would be the most appropriate way to educate young people about data rights? If so, I note that the Secretary of State, Justine Greening, has today announced that Ian Bauckham will lead the review on how relationship and sex education for the 21st century will be delivered. Can the Minister, who is clearly prepared to think about this appointment today, ask whether it is within his scope to think about how data rights education may be delivered as part of that review, and whether the review will draw on the work of the previous person who reviewed the delivery of PSHE, Sir Alasdair Macdonald, the last time Parliament thought that compulsory SRE was a good idea?

Baroness Kidron

I support the amendment. I was on the House of Lords Communications Committee, to which the noble Lord just referred. We recommended that digital literacy be given the same status as reading, writing and arithmetic. We set out an argument for a single cross-curricular framework of digital competencies—evidence-based, taught by trained teachers—in all schools whatever their legal status.

At Second Reading, several noble Lords referred to data as the new oil. I have been thinking about it since: I am not so certain. Oil may one day run out; data is infinite. What I think we can agree is that understanding how data is gathered, used and stored, and, most particularly, how it can be harnessed to manipulate both your behaviour and your digital identity, is a core competency for a 21st-century child. While I agree with the noble Lord that the best outcome would be a single, overarching literacy strategy, this amendment would go some small way towards that.

Data Protection Bill [HL]

(2nd reading (Hansard - continued): House of Lords)
Lord Knight of Weymouth Excerpts
Tuesday 10th October 2017

Lords Chamber
Home Office
Lord Janvrin (CB)

My Lords, I welcome the opportunity to speak in this Second Reading debate. It is always slightly daunting to follow the noble Lord, Lord Lucas. We were colleagues on the Digital Skills Committee a few years back, and he was pretty daunting on that too, being a great fund of knowledge on this subject. I mention at the outset my interests as set out in the register, including as a trustee of the British Library and as a member of the parliamentary Intelligence and Security Committee in the last Parliament. I too welcome this important piece of legislation. I will be brief and confine myself to some general remarks.

There is no doubt that data, big data, data processing and data innovation are all absolutely essential ingredients in the digital revolution which is changing the world around us. However, as we have discussed in debates in this House, advances in technology inevitably risk outstripping our capacity to think through some of the social, ethical and regulatory challenges posed by these advances. This is probably true of questions of data protection.

The last key legislation, the Data Protection Act 1998, was ground-breaking in its time. But it was designed in a different age, when the internet was in its infancy, smartphones did not exist and the digital universe was microscopic compared to today. As the Government have said, we desperately need a regulatory framework which is comprehensive and fit for purpose for the present digital age.

As has been mentioned by other noble Lords, the Bill is also necessary to ensure that our legislation is compatible with the GDPR, which comes into force next year. It is absolutely clear that however Brexit unfolds, our ability to retain an accepted common regulatory framework for handling data is essential; the ability to move data across borders is central to our trading future. I was much struck by the lucid explanation given by the noble Lord, Lord Jay, of some of the challenges which lie ahead in achieving this goal of a common regulatory framework for the future.

The Bill before us is undoubtedly a major advance on our earlier legislation. It is inevitably complex, and as today’s debate makes absolutely clear, there are areas which this House will wish to scrutinise carefully and in depth, including issues of consent and the new rights such as the right to be forgotten and to know when personal data has been hacked, and so on. The two areas which will be of particular interest to me as a member of the board of the British Library and as a member of the Intelligence and Security Committee in the last Parliament will be, first and foremost, archiving in the public interest, and secondly, Part 4, on data processing by the intelligence services.

In order to support archiving activities, as was made clear in the British Library’s submission during the DCMS consultation earlier this year, it is essential that this legislation provide a strong and robust legal basis to support public and private organisations which are undertaking archiving in the public interest. As I understand it, this new legislation confirms the exemptions currently available in the UK Data Protection Act 1998: safeguarding data processing necessary for archiving purposes in the public interest and archiving for scientific, historical and statistical purposes. This is welcome, but there may perhaps be issues around definitions of who and what is covered by the phrase “archiving in the public interest”. I look forward to further discussion and, hopefully, further reassurances on whether the work of public archiving institutions such as our libraries and museums is adequately safeguarded in the Bill.

On Part 4, data processing by the intelligence services does not fall within scope of the GDPR, and this part of the Bill provides a regime based on the Council of Europe’s modernised—but not yet finally agreed—Convention 108. The intelligence services already comply with data-handling obligations within the regulatory structures found in a range of existing legislation. This includes the Investigatory Powers Act 2016, which, as was debated in this Chamber this time last year, creates a number of new offences if agencies wrongly disclose data using the powers in that Act.

The new Bill seeks to replicate the approach of the Data Protection Act 1998, whereby there have been well-established exemptions to safeguard national security. It is obviously vital that the intelligence services be able to continue to operate effectively at home and with our European and other partners, and I look forward to our further discussion during the passage of the Bill on whether this draft legislation gives the intelligence services the safeguards they require to operate effectively.

In sum, this is a most important piece of legislation. If, as the noble Baroness, Lady Lane-Fox, suggests, we can set the bar high, it will be a most significant step forward. First, it will redefine the crucial balance between, on the one hand, the freedom to grasp the extraordinary opportunities offered by the new data world we are in and, on the other, the need to protect sensitive personal data. Secondly, and very importantly, it will put the United Kingdom at the forefront of wider efforts to regulate sensibly and pragmatically the digital revolution which is changing the way we run our lives.

Lord Knight of Weymouth (Lab)

My Lords, as the economy becomes more digitised, the politics of data become centrally important. As the Minister himself said, data is the fuel of the digital economy, and public policy now needs an agile framework around which to balance the forces at play. We need to power the economy and innovation with data while protecting the rights of the individual and of wider society from exploitation by those who hold our data. The recent theft of the personal details of 143 million Americans in the hack of Equifax, or the unfolding story of abuse of social media in the US elections by Russian agents, makes the obvious case for data protection.

This Bill attempts to help us tackle some big moral and ethical dilemmas, and we as parliamentarians have a real struggle to be sufficiently informed in a rapidly changing and innovative environment. I welcome the certainty that the Bill gives us in implementing the GDPR in this country in a form that anticipates Brexit and the need to continue to comply with EU data law regardless of membership of the EU in the future.

However, we need e-privacy alongside the GDPR. For example, access to a website being conditional on accepting tracking cookies should be outlawed; we need stricter rules on wi-fi location tracking; browsers should have privacy high by default; and we need to look at extending the protections around personal data to metadata derived from personal data.

But ultimately I believe that the GDPR is an answer to the past. It is a long-overdue response to past and current data practice, but it is a long way from what the Information Commissioner’s briefing describes as,

“one of the final pieces of much needed data protection reform”.

I am grateful to Nicholas Oliver, the founder of, and to Gi Fernando from Freeformers for helping my thinking on these very difficult issues.

The Bill addresses issues of consent, erasure and portability to help protect us as citizens. I shall start with consent. A tougher consent regime is important but how do we make it informed? Even if 13 is the right age for consent, how do we inform that consent with young people, with parents, with adults generally, with vulnerable people and with small businesses which have to comply with this law? Which education campaigns will cut through in a nation where 11 million of us are already digitally excluded and where digital exclusion does not exclude significant amounts of personal data being held about you? And what is the extent of that consent?

As an early adopter of Facebook 10 years ago, I would have blindly agreed to its terms and conditions that required its users to grant it,

“a non-exclusive, transferable, sub-licensable, royalty-free, worldwide license to use any IP content”.

I posted on the site. It effectively required me to give it the right to use my family photos and videos for marketing purposes and to resell them to anybody. Thanks to this Bill, it will be easier for me to ask it to delete that personal data and it will make it easier for me to take it away and put it goodness knows where else with whatever level of security I deem fit, if I can trust it. That is welcome, although I still quite like Facebook, so I will not do it just yet.

But what about the artificial intelligence generated from that data? If, in an outrageous conflagration of issues around fake news and election-fixing by a foreign power to enable a reality TV star with a narcissistic personality disorder to occupy the most powerful executive office in the free world, I take against Facebook, can I withdraw consent for my data to be used to inform artificial intelligences that Facebook can go on to use for profit and for whatever ethical use they see fit? No, I cannot.

What if, say, Google DeepMind got hold of NHS data and its algorithms were used with bias? What if Google gets away with breaking data protection as part of its innovation and maybe starts its own ethics group, marking its own ethics homework? Where is my consent and where do I get a share of the revenue generated by Google selling the intelligence derived in part from my data? And if it sells that AI to a health company which sells a resulting product back to the NHS, how do I ensure that the patients are advantaged because their data was at the source of the product?

No consent regime can anticipate future use or the generation of intelligent products by aggregating my data with that of others. The new reality is that consent in its current form is dead. Users can no longer reasonably comprehend the risk associated with data sharing, and so cannot reasonably be asked to give consent.

The individual as a data controller also becomes central. I have plenty of names, addresses, phone numbers and email addresses, and even the birthdays of my contacts in my phone. Some are even Members of your Lordships’ House. If I then, say, hire a car and connect my phone to the car over Bluetooth so that I can have hands-free driving and music from my phone, I may then end up sharing that personal contact data with the car and thereby all subsequent hirers of the car. Perhaps I should be accountable with the car owner for that breach.

Then, thanks to AI, in the future we will also have to resolve the paradox of consent. If AI determines that you have heart disease by facial recognition or by reading your pulse, it starts to make inference outside the context of consent. The AI knows something about you, but how can you give consent for it to tell you when you do not know what it knows? Here, we will probably need to find an intermediary to represent the interests of the individual, not the state or wider society. If the AI determines that you are in love with someone based on text messages, does the AI have the right to tell you or your partner? What if the AI is linked to your virtual assistant—to Siri or Google Now—and your partner asks Siri whether you are in love with someone else? What is the consent regime around that? Clause 13, which deals with a “significant decision”, may help with that, but machine learning means that some of these technologies are effectively a black box where the creators themselves do not even know the potential outcomes.

The final thing I want to say on consent concerns the sensitive area of children. Schools routinely use commercial apps for things such as recording behaviour, profiling children, cashless payments, reporting and so on. I am an advocate of the uses of these technologies. Many have seamless integration with the school management information systems that thereby expose children’s personal data to third parties based on digital contracts. Schools desperately need advice on GDPR compliance to allow them to comply with this Bill when it becomes law.

Then there is the collection of data by schools to populate the national pupil database held by the Department for Education. This database contains highly sensitive data about more than 8 million children in England and is routinely shared with academic researchers and other government departments. The justification for this data collection is not made clear by the DfE and causes a big workload problem in schools. Incidentally, this is the same data about pupils that was shared with the Home Office for it to pursue deportation investigations. I am talking about data collected by teachers for learning being used for deportation. Where is the consent in that?

I have here a letter from a Lewisham school advising parents of its privacy policy. It advises parents to go to a government website to get more information about how the DfE stores and uses the data, if they are interested. That site then advises that the Government,

“won’t share your information with any other organisations for marketing, market research or commercial purposes”.

That claim does not survive any scrutiny. For example, Tutor Hunt, a commercial tutoring company, was granted access to the postcode, date of birth and unique school reference number of all pupils for two years, up to the end of March this year, to give parents advice on school choice. Similar data releases have been given to journalists and others. It may be argued that this data is still anonymous, but it is laughable to suggest that identity cannot then be re-engineered, or engineered in the first place, from birth date, postcode and school. The Government need to get their own house in order to comply with the Bill.
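The re-identification risk described here can be sketched in a few lines of Python. Everything below is invented for illustration: the names, postcodes and reference numbers are hypothetical, and the only point is that a (date of birth, postcode, school) triple is frequently unique, so joining an "anonymous" release against any named dataset sharing those fields recovers identities.

```python
# Hypothetical illustration of re-identification from quasi-identifiers
# (date of birth, postcode, school reference). All data is invented.

anonymous_release = [
    {"dob": "2005-03-14", "postcode": "SE13 7XY", "school_ref": 100123, "attainment": "B"},
    {"dob": "2005-03-14", "postcode": "SE13 7XY", "school_ref": 100456, "attainment": "A"},
    {"dob": "2006-11-02", "postcode": "SE13 7XY", "school_ref": 100123, "attainment": "C"},
]

# A second, innocuous dataset that names individuals (e.g. a sports club list).
named_dataset = [
    {"name": "Alice Example", "dob": "2005-03-14", "postcode": "SE13 7XY", "school_ref": 100123},
]

def reidentify(anon, named):
    """Join on the quasi-identifiers; a unique match re-attaches a name."""
    results = []
    for person in named:
        key = (person["dob"], person["postcode"], person["school_ref"])
        matches = [r for r in anon
                   if (r["dob"], r["postcode"], r["school_ref"]) == key]
        if len(matches) == 1:  # a unique combination means identity is recovered
            results.append((person["name"], matches[0]["attainment"]))
    return results

print(reidentify(anonymous_release, named_dataset))
# → [('Alice Example', 'B')]
```

Removing names from the release offers no protection here, because the three remaining fields together act as a fingerprint.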

That leads me to erasure, which normally means removing all data that relates to an individual, such as name, address and so on. The remaining data survives with a unique numeric token as an identifier. Conflicting legislation will continue to require companies to keep data for accounting purposes. If that includes transactions, there will normally be enough data to re-engineer identity from an identity token number. There is a clause in the Bill to punish that re-engineering, which needs debating to legitimise benign attempts to test research and data security, as discussed by the noble Baroness, Lady Manningham-Buller.
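The same weakness applies to erasure by tokenisation. A minimal sketch, with wholly invented data: records retained for accounting keep only a numeric token, yet an observer who knows even a couple of someone's real transactions can single out their token, at which point the "erased" record is theirs again.

```python
# Hypothetical sketch: after "erasure", records keep only a numeric token,
# but retained transaction data can still single a person out. Data invented.

ledger = {
    7001: [("2017-01-03", 4.50), ("2017-01-10", 12.00), ("2017-02-01", 4.50)],
    7002: [("2017-01-03", 9.99), ("2017-01-21", 4.50)],
}

# An observer who knows two of someone's real purchases...
known_purchases = [("2017-01-10", 12.00), ("2017-02-01", 4.50)]

def match_token(ledger, known):
    """Return tokens whose retained transactions contain every known purchase."""
    return [token for token, txns in ledger.items()
            if all(k in txns for k in known)]

print(match_token(ledger, known_purchases))
# → [7001] — the token, and everything stored under it, is no longer anonymous
```

As the transaction history grows, fewer known purchases are needed to make the match unique, so the accounting data kept after erasure is usually more than enough.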

The fact that the Bill acknowledges how easy it is to re-identify from anonymous data points to a problem. The examples of malign hacking from overseas are countless. How do we prevent that with UK law? What are the Government’s plans, especially post Brexit, to address this risk? How do we deal with the risk of a benign UK company collecting data with consent—perhaps Tutor Hunt, which I referred to earlier—that is then acquired by an overseas company, which then uses that data free from the constraints of this legislation?

In the context of erasure, let me come to an end by saying that the Bill also allows for the right to be forgotten for children as they become 18. This is positive, as long as the individual can choose what he or she wants to keep. Otherwise, it would be like suggesting you burn your photo albums to stop an employer judging you.

Could the Minister tell me how the right to be forgotten works with the blockchain? These decentralised encrypted trust networks are attractive to those who do not trust big databases for privacy reasons. By design, data is stored in a billion different tokens and synced across countless devices. That data is immutable. Blockchain is heavily used in fintech, and London is a centre for fintech. But the erasure of blockchain data is impossible. How does that work in this Bill?
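The structural reason erasure conflicts with blockchain can be shown with a toy hash chain. This is a simplified sketch, not any particular blockchain implementation: each block's hash covers the previous block's hash, so deleting or altering any record invalidates every subsequent block on every synced copy.

```python
# Toy hash chain illustrating why blockchain records resist erasure.
import hashlib
import json

def block_hash(prev_hash, data):
    payload = prev_hash + json.dumps(data, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(records):
    chain, prev = [], "0" * 64
    for data in records:
        h = block_hash(prev, data)
        chain.append({"prev": prev, "data": data, "hash": h})
        prev = h
    return chain

def verify(chain):
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev or block_hash(prev, block["data"]) != block["hash"]:
            return False
        prev = block["hash"]
    return True

chain = build_chain([{"user": "A", "record": 1}, {"user": "B", "record": 2}])
assert verify(chain)

# "Erasing" user A's data breaks the chain: every copy now fails verification.
chain[0]["data"] = {"user": "ERASED", "record": 1}
assert not verify(chain)
```

Because validity depends on the stored data being byte-for-byte unchanged, the only way to "erase" is to rebuild and redistribute the entire chain, which the decentralised design exists to prevent.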

There is more to be said about portability, law enforcement and the intelligence services, but thinking about this Bill makes my head hurt. Let me close on a final thought. The use of data to fuel our economy is critical. The technology and artificial intelligence it generates have a huge power to enhance us as humans and to do good. That is the utopia we must pursue. Doing nothing heralds a dystopian outcome, but the pace of change is too fast for us legislators, and too complex for most of us to fathom. We therefore need to devise a catch-all for automated or intelligent decisioning by future data systems. Ethical and moral clauses could and should, I argue, be forced into terms of use and privacy policies. That is the only feasible way to ensure that the intelligence resulting from the use of one’s data is not subsequently used against us as individuals or society as a whole. This needs urgent consideration by the Minister.

Baroness Kidron (CB)

My Lords, many noble Lords will know that my particular interests, clearly stated on the register, are concerned with making the digital world fit for children and young people, and so the greater part of my comments concern that. However, I wanted to say at the outset that dealing with this Bill without having had the opportunity to scrutinise the GDPR or understand the ambition and scope of the Government’s digital charter, their internet safety strategy or even some of the details that we still await on the Digital Economy Act made my head hurt also.

I start with the age of consent. Like others, I am concerned that the age of 13 was a decision reached not on the advice of child development experts, child campaigners or parents. Perhaps most importantly of all, the decision lacks the voice of young people. They are key players in this: the early adopters of emerging technologies, the first to spot their problems and, so very often, the last to be consulted, or indeed not consulted at all. Also, like others, I was bewildered when I saw Clause 187. Are Scottish children especially mature, or are their southern counterparts universally less so? More importantly, it seems that we have to comply with the GDPR, except when we do not.

As the right reverend Prelate has outlined, the age of 13 is really an age of convenience. We have simply chosen to align UK standards with COPPA, a piece of US legislation that its own authors once described to me as a “terrible compromise”, and which dates from 2000, when the notion of every child carrying a smartphone with the processing power of “Apollo 11” and consulting it every few minutes, hundreds of times day and night, was not even in our imagination, let alone our reality.

Before considering whether 13 is the right age, we should understand what plans the Government have to require tech companies to make any provisions for those aged 13 to 17, or whether it is the considered opinion of the UK Government that in the digital environment a 13 year-old is a de facto adult. Will the Government require tech companies to publish data risk assessments setting out how children are likely to engage with their service at different ages and the steps they have taken to support them, including transparent reporting data? Are we to have minimum design standards in parts of the digital environment that children frequent, and that includes those places that they are not supposed to be? Will the ICO have powers to enforce against ISS providers which do not take steps to prevent very young children accessing services designed for people twice their age? My understanding is that age compliance will continue to be monitored and enforced by the ISS companies themselves.

As Ofcom pointed out, in 2016 in the UK, 21% of 10 year-olds, 43% of 11 year-olds and half of all 12 year-olds had a social media profile, in spite of COPPA. Are the Government planning to adequately resource and train all front-line workers who work with children, as well as teachers, parents and children themselves, in a programme of digital literacy, as the House of Lords Communications Committee called for, and in doing so inform all concerned, both those aged 13 and under and those between 13 and 18, about the impact on young people of inhabiting what is increasingly a commercial environment? Until these questions are answered positively, the argument for a hard age of consent seems weak.

In contrast, in its current code of practice on processing personal data online, the ICO recommends a nuanced approach, advising would-be data collectors that:

“Assessing understanding, rather than merely determining age, is the key to ensuring that personal data about children is collected and used fairly”.

The current system places the obligation on the data controller to consider the context of the child user, and requires them to frame and direct the request appropriately. It underpins what we know about childhood: that it is a journey from dependence to autonomy, from infancy to maturity. Different ages require different privileges and levels of support.

If being GDPR compliant requires a hard age limit, how do we intend to verify the age of the child in any meaningful way without, perversely, collecting more data from children than we do from adults? Given that the age of consent is to vary from country to country—16 in the Netherlands, Germany and Hungary; 14 in Austria—data controllers will also need to know the location of a child so that the right rules can be applied. Arguably, that creates more risk for children, but it will certainly create more data.

In all of this we must acknowledge a child’s right to access the digital world knowledgeably, creatively and fearlessly. Excluding children is not the answer, but providing a digital environment fit for them to flourish in must be. There is not enough in this Bill to fundamentally realign young people’s relationship with tech companies when it comes to their data.

Much like the noble Lord, Lord Knight, my view is that we have got this all wrong. In the future, the user will be the owner of their own data, with our preferences attached to our individual online identity. Companies and services will sign up to our bespoke terms and conditions, which will encompass our interests and tolerances, rather than the other way round. If that sounds a little far-fetched, I refer noble Lords to the IEEE, where this proposal is laid out in considerable detail. For those who do not know the IEEE, it is the pre-eminent global organisation of the electrical engineering professions.

While this rather better option is not before us today, it must inform our understanding that the Bill is effectively supporting an uncomfortable status quo. Challenging the status quo means putting children first, for example by putting the code of practice promised in the Digital Economy Act on a statutory footing so that it is enforceable; by imposing minimum design standards where the end-user is likely or may be a child; by publishing guidance to the tech companies on privacy settings, tracking, GPS and so forth; by demanding that they meet the rights of young people in the digital environment; and by a much tougher, altogether more appropriate, regime for children’s data.

All that could and should be achieved by May, because it comes down to the small print and the culture of a few very powerful businesses for which our children are no match. The GDPR offers warm words on consumer rights, automated profiling and data minimisation, but with terms and conditions as long as “Hamlet”, it is disingenuous to believe that plain English or any number of tick boxes for informed or specific consent will materially protect young people from the real-life consequences of data harvesting, which are intrusive, especially when we have left the data poachers in charge of the rules of engagement.

We could do better—a lot better. I agree wholeheartedly with other noble Lords who are looking for structures and principles that will serve us into the future. Those principles should not only serve us in our relations with other EU member states but be bold enough to give us a voice in Silicon Valley. In the meantime, the Government can and should enact the derogation under article 80(2), and in the case of complainants under the age of 18 it should be not only a right but a requirement. We cannot endorse a system where we create poster children on front-line battles with tech companies. We are told that this Bill is about data protection for individuals—a Bill that favours users over business and children over the bottom line. But the absence of Article 8 of the European Charter of Fundamental Rights is an inexcusable omission. The Bill in front of us is simply not robust enough to replace Article 8. I call on the Government to insert that crucial principle into UK legislation. It must be wrong for our post-Brexit legislation to be deliberately devoid of underlying principles. It is simply not adequate.

I had a laundry list of issues to bring to Committee, but I think I will overlook them. During the debate, a couple of noble Lords asked whether it was possible to regulate the internet. We should acknowledge that the GDPR shows that it can be done, kicking and screaming. It is in itself a victory for a legislative body—the EU. My understanding is that it will set a new benchmark for data-processing standards and will be adopted worldwide to achieve a harmonised global framework. As imperfect as it is, it proves that regulating the digital environment, which is entirely man and woman-made and entirely privately owned, is not an impossibility but a battle of societal need versus corporate will.

As I said at the beginning, my central concern is children. A child is a child until they reach maturity, not until they reach for their smartphone. Until Mark Zuckerberg, Sergey Brin and Larry Page, Tim Cook, Jack Dorsey and the rest, with all their resources and creativity, proactively design a digital environment that encompasses the needs of children and recognises the concept of childhood, I am afraid that it falls to us to insist. The Bill as it stands, even in conjunction with the GDPR, is not insistent enough; that is something I hope we can address together as we follow its passage.