Data Protection Bill [HL] Debate
Lord Knight of Weymouth (Labour - Life peer)
My Lords, as I have said on a number of occasions, for 40 years my job was teaching, 20 of those years as a head teacher. One of my prime responsibilities as a head teacher was the safeguarding of children in my school. That was the most important thing I did: to make sure they were safe, so that those primary-age children, aged from five to 11, and nursery children as well, could enjoy their childhood and their parents could know that they were safe and enjoying their innocence.
The Government did a lot with their education policies about safeguarding. Anyone visiting the school had to be checked and double-checked and had to wear identification. Children who went out of school had to be properly escorted. As part of our personal and social health education, we made sure that young people themselves understood. Yet, when it comes to this area, we seem not to take the role as seriously as we should. I was reading the newspapers on the train from Liverpool this morning. I just could not believe the Times headline:
“Children as young as ten are sexting”.
The article says that,
“according to figures from the National Police Chiefs’ Council. In 2015-16, there were 4,681 cases”,
where children as young as 10 were either sending inappropriate messages or photographs to other pupils or receiving them. Imagine it was your daughter who at the age of seven or eight—and some of them are that young—was receiving inappropriate pictures from other pupils. How would you feel as a parent? Is that really protecting or safeguarding those children?
I do not want to speak at length in this debate; I think the noble Baronesses, Lady Kidron and Lady Harding, have said it all. It is not beyond our wit to do these simple things. I have seen for myself that self-regulation does not work. I hope that between now and Report the Government will put aside any feeling that, “We can’t do this because of the EU, because of our own lethargy, because of what we have said in the past or because it will create more regulation”. This is about children. Let us all agree that on Report we can agree these eminently sensible amendments.
My noble friend made a very strong case. The internet was designed for adults, but I think I am right in saying that 25% of time spent online is spent by children. A child is a child, whether online or offline, and we cannot treat a 13 year-old as an adult. It is quite straightforward: the internet needs to be designed for safety. That means it must be age appropriate, and the technology companies need to do something about it. I support the amendments very strongly.
My Lords, the noble Lord, Lord Stevenson, said that he hoped I had a sense of where the Committee is coming from. I very much have a sense of that. I know that child online safety is an issue that is taken seriously by all noble Lords in the House, and it has been the subject of much debate apart from today. I am therefore grateful to the noble Baroness and to all who contributed for introducing this important subject. I assure all noble Lords that we have an open mind. However, I will pour a bit of cold water because some issues, to which we may well come back, need to be thought about. I apologise to the noble Baroness, Lady Kidron, for the fact that we have not met. I thought that we were arranging a meeting. I have certainly talked to my noble friend Lady Harding about these amendments. However, I repeat not only to her but to every noble Lord that I am very happy to talk to anyone about these matters before Report, and I have no doubt that I will be talking to the noble Baroness before too long.
At Second Reading we heard a good deal about the need to improve online safety and concerns about the role that social media companies play in young people’s lives. The Government are fully committed to this cause. Our approach has been laid out in the Internet Safety Strategy Green Paper, published earlier this month. In that strategy, the Government detailed a number of commitments to improve online safety for all users and issued a consultation on further work, including the social media code of practice, the social media levy and transparency reporting. Although the Government are currently promoting a voluntary approach to work with industry, we have clearly stated in the strategy—and I repeat it now—that legislation will be introduced if necessary, and this will be taken forward in the digital charter.
The Government’s clear intention is to educate all users on the safe use of online sites such as social media sites. Again, this is set out in the strategy. This includes efforts targeted at children, comprising working with civil society groups to support peer-to-peer programmes and revised national curriculums. We believe that education is fundamental to safe use of the internet because it enables users to build the skills and resilience needed to navigate the online world and to be capable of adapting to the continuous changes and innovations that we see in this space.
The aim of these amendments is to allow information society services to make use of the derogation in the GDPR to set the age threshold at 13 only if sites comply with guidance on the minimum standards of age-appropriate design as set out by the Information Commissioner. Although the Government are sympathetic to their goal to raise the level of safety online, we have some questions about how it would work in practice and some fundamental concerns about its possible unintended consequences.
The noble Lord, Lord Storey, said that we should not rest our case on EU law. That is an enticing argument, especially from a Liberal Democrat, but I think that there is a sense of frustration there and I would not hold him to that. However, the fact is that, as we discussed last week, we are determined to ensure that we preserve the free flow of data once the UK leaves the EU.
I have to raise the issue of compliance with the GDPR, because we have a very real concern that these amendments are not compatible with it. The GDPR was designed as a regulation to ensure harmonisation of data protection laws across the EU. The nature of the internet and the transnational flow of data that it entails mean that effective regulations need international agreement. However, these amendments would create additional burdens for data controllers. Article 8 of the GDPR says that member states may provide by law for a lower age but it does not indicate that exercising this derogation should be conditional on other requirements. These amendments go further than permitted, creating a risk for our future trading relationships.
The noble Baroness mentioned that she had advice from a prominent QC. If she would care to share that with us, I would be happy to discuss it with her, and we will put that in front of our lawyers as well. I have an open mind on this but we think that there is an issue as far as the GDPR’s compatibility is concerned.
Amendment 155 would require the Information Commissioner to produce guidance on standards and design. The Information Commissioner will already be providing guidance on minimum standards to comply with the requirement not to offer services to under-13s without parental consent. Indeed, it will be the role of the commissioner to enforce the new law on consent. Although the guidance will not include details on age-appropriate design, this is not something that should be overlooked by government. However, tackling the problem of age-appropriate design is not just a data protection issue, and we should be very cautious about using this age threshold as a tool to keep children off certain sites. This is about their data and not the more fundamental question of the age at which children should be able to use these sites.
We need to educate children and work with internet companies to keep them safe and allow them to benefit from being online. Where there is clearly harmful material, such as online pornography, we have acted to protect children through a requirement for age verification in the Digital Economy Act 2017. The Government’s Internet Safety Strategy addresses a wide range of ways to protect the public online. While online safety, particularly for children, is very important, we should not be confusing this with the age at which parental consent is no longer required for the processing of personal data by online services. The Government have a clear plan of action.
I do not think I mentioned confusion. What we are talking about in the Bill is purely data protection. We are talking about the age at which children can consent to information society services handling their data. What I think the noble Baroness, and a lot of Peers in the House, are talking about is keeping children safe online, which is more than just protection of their personal data.
I am happy to confirm those two points. On extraterritoriality, I agree with the noble Baroness that it is difficult to control. Commercial sites are easier to deal with—gambling is an example. We can control the payments, so if commercial sites cannot take payment, they may well lose their attractiveness. Of course, the only way to solve this is through international agreement, and the Government are working on that. Part of my point is that there is a risk in driving children away to sites located abroad. The big, well-known sites are by and large responsible: they may not do exactly what we want, but they will work with the Government, and that is the thrust of our argument. The noble Baroness is right to say that, if we drive children on to less responsible sites based in jurisdictions with less sensible and acceptable regimes, that is a problem.
I cannot give the noble Lord chapter and verse on what the European bureaucrats were thinking when they produced the article, but age verification is not really the issue on this one, because it is extremely difficult to verify ages below 18 anyway. Although one can get a driving licence at 17, it is only at 18 that one can have a credit card. As I say, the issue here is not age verification—rather, it is that, when we make things too onerous, we risk driving people away on to other sites which take their responsibilities less seriously. That was the point I was trying to make.
We are therefore at an important time. By agreeing this amendment, we can ensure that PSHE will be the vehicle by which these issues can be taught to all children in all schools. I hope that when we come to Report the Minister will be able to report that that will be the case. I beg to move.
I support the amendment. I was on the House of Lords Communications Committee, to which the noble Lord just referred. We recommended that digital literacy be given the same status as reading, writing and arithmetic. We set out an argument for a single cross-curricular framework of digital competencies—evidence-based, taught by trained teachers—in all schools whatever their legal status.
At Second Reading, several noble Lords referred to data as the new oil. I have been thinking about it since: I am not so certain. Oil may one day run out; data is infinite. What I think we can agree is that understanding how data is gathered, used and stored, and, most particularly, how it can be harnessed to manipulate both your behaviour and your digital identity, is a core competency for a 21st-century child. While I agree with the noble Lord that the best outcome would be a single, overarching literacy strategy, this amendment would go some small way towards that.
Lord Knight of Weymouth (Labour - Life peer)
My Lords, I welcome the opportunity to speak in this Second Reading debate. It is always slightly daunting to follow the noble Lord, Lord Lucas. We were colleagues on the Digital Skills Committee a few years back, and he was pretty daunting on that too, being a great fund of knowledge on this subject. I mention at the outset my interests as set out in the register, including as a trustee of the British Library and as a member of the parliamentary Intelligence and Security Committee in the last Parliament. I too welcome this important piece of legislation. I will be brief and confine myself to some general remarks.
There is no doubt that data, big data, data processing and data innovation are all absolutely essential ingredients in the digital revolution which is changing the world around us. However, as we have discussed in debates in this House, advances in technology inevitably risk outstripping our capacity to think through some of the social, ethical and regulatory challenges posed by these advances. This is probably true of questions of data protection.
The last key legislation, the Data Protection Act 1998, was ground-breaking in its time. But it was designed in a different age, when the internet was in its infancy, smartphones did not exist and the digital universe was microscopic compared to today. As the Government have said, we desperately need a regulatory framework which is comprehensive and fit for purpose for the present digital age.
As has been mentioned by other noble Lords, the Bill is also necessary to ensure that our legislation is compatible with the GDPR, which comes into force next year. It is absolutely clear that however Brexit unfolds, our ability to retain an accepted common regulatory framework for handling data is essential; the ability to move data across borders is central to our trading future. I was much struck by the lucid explanation given by the noble Lord, Lord Jay, of some of the challenges which lie ahead in achieving this goal of a common regulatory framework for the future.
The Bill before us is undoubtedly a major advance on our earlier legislation. It is inevitably complex, and as today’s debate makes absolutely clear, there are areas which this House will wish to scrutinise carefully and in depth, including issues of consent and the new rights such as the right to be forgotten and to know when personal data has been hacked, and so on. The two areas which will be of particular interest to me as a member of the board of the British Library and as a member of the Intelligence and Security Committee in the last Parliament will be, first and foremost, archiving in the public interest, and secondly, Part 4, on data processing by the intelligence services.
In order to support archiving activities, as was made clear in the British Library’s submission during the DCMS consultation earlier this year, it is essential that this legislation provide a strong and robust legal basis to support public and private organisations which are undertaking archiving in the public interest. As I understand it, this new legislation confirms the exemptions currently available in the UK Data Protection Act 1998: safeguarding data processing necessary for archiving purposes in the public interest and archiving for scientific, historical and statistical purposes. This is welcome, but there may perhaps be issues around definitions of who and what is covered by the phrase “archiving in the public interest”. I look forward to further discussion and, hopefully, further reassurances on whether the work of public archiving institutions such as our libraries and museums is adequately safeguarded in the Bill.
On Part 4, data processing by the intelligence services does not fall within scope of the GDPR, and this part of the Bill provides a regime based on the Council of Europe’s modernised—but not yet finally agreed—Convention 108. The intelligence services already comply with data-handling obligations within the regulatory structures found in a range of existing legislation. This includes the Investigatory Powers Act 2016, which, as was debated in this Chamber this time last year, creates a number of new offences if agencies wrongly disclose data using the powers in that Act.
The new Bill seeks to replicate the approach of the Data Protection Act 1998, whereby there have been well-established exemptions to safeguard national security. It is obviously vital that the intelligence services be able to continue to operate effectively at home and with our European and other partners, and I look forward to our further discussion during the passage of the Bill on whether this draft legislation gives the intelligence services the safeguards they require to operate effectively.
In sum, this is a most important piece of legislation. If, as the noble Baroness, Lady Lane-Fox, suggests, we can set the bar high, it will be a most significant step forward. First, it will redefine the crucial balance between, on the one hand, the freedom to grasp the extraordinary opportunities offered by the new data world we are in and, on the other, the need to protect sensitive personal data. Secondly, and very importantly, it will put the United Kingdom at the forefront of wider efforts to regulate sensibly and pragmatically the digital revolution which is changing the way we run our lives.
My Lords, many noble Lords will know that my particular interests, clearly stated on the register, are concerned with making the digital world fit for children and young people, and so the greater part of my comments concern that. However, I wanted to say at the outset that dealing with this Bill without having had the opportunity to scrutinise the GDPR or understand the ambition and scope of the Government’s digital charter, their internet safety strategy or even some of the details that we still await on the Digital Economy Act made my head hurt also.
I start with the age of consent. Like others, I am concerned that the age of 13 was a decision reached not on the advice of child development experts, child campaigners or parents. Perhaps most importantly of all, the decision lacks the voice of young people. They are key players in this: the early adopters of emerging technologies, the first to spot its problems and, so very often, the last to be consulted or, indeed, not consulted at all. Also, like others, I was bewildered when I saw Clause 187. Are Scottish children especially mature or are their southern counterparts universally less so? More importantly, it seems that we have to comply with the GDPR, except when we do not.
As the right reverend Prelate has outlined, the age of 13 is really an age of convenience. We have simply chosen to align UK standards with COPPA, a piece of US legislation that its own authors once described to me as a “terrible compromise”, and which dates from 2000, when the notion of every child carrying a smartphone with the processing power of “Apollo 11” and consulting it every few minutes, hundreds of times day and night, was not even in our imagination, let alone our reality.
Before considering whether 13 is the right age, we should understand what plans the Government have to require tech companies to make any provisions for those aged 13 to 17, or whether it is the considered opinion of the UK Government that in the digital environment a 13 year-old is a de facto adult. Will the Government require tech companies to publish data risk assessments setting out how children are likely to engage with their service at different ages and the steps they have taken to support them, including transparent reporting data? Are we to have minimum design standards in parts of the digital environment that children frequent, and that includes those places that they are not supposed to be? Will the ICO have powers to enforce against ISS providers which do not take steps to prevent very young children accessing services designed for people twice their age? My understanding is that age compliance will continue to be monitored and enforced by the ISS companies themselves.
As Ofcom pointed out, in 2016 in the UK, 21% of 10 year-olds, 43% of 11 year-olds and half of all 12 year-olds had a social media profile, in spite of COPPA. Are the Government planning to adequately resource and train all front-line workers who work with children, as well as teachers, parents and children themselves, in a programme of digital literacy, as the House of Lords Communications Committee called for, and in doing so inform all concerned—those aged 13 and under and those between 13 and 18—of the impact on young people of inhabiting what is increasingly a commercial environment? Until these questions are answered positively, the argument for a hard age of consent seems weak.
In contrast, in its current code of practice on processing personal data online, the ICO recommends a nuanced approach, advising would-be data collectors that:
“Assessing understanding, rather than merely determining age, is the key to ensuring that personal data about children is collected and used fairly”.
The current system places the obligation on the data controller to consider the context of the child user, and requires them to frame and direct the request appropriately. It underpins what we know about childhood: that it is a journey from dependence to autonomy, from infancy to maturity. Different ages require different privileges and levels of support.
If being GDPR compliant requires a hard age limit, how do we intend to verify the age of the child in any meaningful way without, perversely, collecting more data from children than we do from adults? Given that the age of consent is to vary from country to country—16 in the Netherlands, Germany and Hungary; 14 in Austria—data controllers will also need to know the location of a child so that the right rules can be applied. Arguably, that creates more risk for children; it will certainly create more data.
In all of this we must acknowledge a child’s right to access the digital world knowledgeably, creatively and fearlessly. Excluding children is not the answer, but providing a digital environment fit for them to flourish in must be. There is not enough in this Bill to fundamentally realign young people’s relationship with tech companies when it comes to their data.
Much like the noble Lord, Lord Knight, my view is that we have got this all wrong. In the future, the user will be the owner of their own data, with our preferences attached to our individual online identity. Companies and services will sign up to our bespoke terms and conditions, which will encompass our interests and tolerances, rather than the other way round. If that sounds a little far-fetched, I refer noble Lords to the IEEE, where this proposal is laid out in considerable detail. For those who do not know the IEEE, it is the pre-eminent global organisation of the electrical engineering professions.
While this rather better option is not before us today, it must inform our understanding that the Bill is effectively supporting an uncomfortable status quo. Challenging the status quo means putting children first, for example by putting the code of practice promised in the Digital Economy Act on a statutory footing so that it is enforceable; by imposing minimum design standards where the end-user is likely or may be a child; by publishing guidance to the tech companies on privacy settings, tracking, GPS and so forth; by demanding that they meet the rights of young people in the digital environment; and by a much tougher, altogether more appropriate, regime for children’s data.
All that could and should be achieved by May, because it comes down to the small print and the culture of a few very powerful businesses for which our children are no match. The GDPR offers warm words on consumer rights, automated profiling and data minimisation, but with terms and conditions as long as “Hamlet”, it is disingenuous to believe that plain English or any number of tick boxes for informed or specific consent will materially protect young people from the intrusive real-life consequences of data harvesting, especially when we have left the data poachers in charge of the rules of engagement.
We could do better—a lot better. I agree wholeheartedly with other noble Lords who are looking for structures and principles that will serve us into the future. Those principles should not only serve us in terms of other EU member states but be bold enough to give us a voice in Silicon Valley. In the meantime, the Government can and should enact the derogation under article 80(2); in the case of complainants under the age of 18, it should be not only a right but a requirement. We cannot endorse a system where we create poster children on front-line battles with tech companies. We are told that this Bill is about data protection for individuals—a Bill that favours users over business and children over the bottom line. But the absence of Article 8 of the European Charter of Fundamental Rights is an inexcusable omission. The Bill in front of us is simply not robust enough to replace Article 8. I call on the Government to insert that crucial principle into UK legislation. It must be wrong for our post-Brexit legislation to be deliberately devoid of underlying principles. It is simply not adequate.
I had a laundry list of issues to bring to Committee, but I think I will overlook them. During the debate, a couple of noble Lords asked whether it was possible to regulate the internet. We should acknowledge that the GDPR shows that it can be done, albeit kicking and screaming. It is in itself a victory for a legislative body—the EU. My understanding is that it will set a new benchmark for data-processing standards and will be adopted worldwide to achieve a harmonised global framework. As imperfect as it is, it proves that regulating the digital environment, which is entirely man and woman-made and entirely privately owned, is not an impossibility but a battle of societal need versus corporate will.
As I said at the beginning, my central concern is children. A child is a child until they reach maturity, not until they reach for their smartphone. Until Mark Zuckerberg, Sergey Brin and Larry Page, Tim Cook, Jack Dorsey and the rest, with all their resources and creativity, proactively design a digital environment that encompasses the needs of children and refers to the concept of childhood, I am afraid that it falls to us to insist. The Bill as it stands, even in conjunction with the GDPR, is not insistent enough, which I hope is something that we can address together as we follow its passage.