Data Protection Bill [HL] Debate
Lord Alton of Liverpool (Crossbench - Life peer)
Debate with the Department for Digital, Culture, Media & Sport
Lords Chamber

My noble friend made a very strong case. The internet was designed for adults, but I think I am right in saying that 25% of time spent online is spent by children. A child is a child, whether online or offline, and we cannot treat a 13-year-old as an adult. It is quite straightforward: the internet needs to be designed for safety. That means it must be age appropriate, and the technology companies need to do something about it. I support the amendments very strongly.
My Lords, I, too, support my noble friend Lady Kidron. Last week, with her and my noble friend Lord Best, I was able to attend a briefing session with the right honourable Karen Bradley, the Secretary of State. I found that very helpful. We were looking at the Green Paper on internet safety published on 11 October. It is curious that we are here in Committee talking about some of the same issues when that significant consultation is being undertaken by the Government. I hope that when the noble Lord, Lord Ashton of Hyde, comes to reply to the debate, he will say something about how the Government intend to synchronise the discussion of and consultation on the Green Paper that is under way with the moving horse of legislation that is proceeding through your Lordships’ House.
During our discussions last week, my noble friend raised again the duty to protect. I agree with what the noble Lord, Lord Knight, just said about this providing an elegant way forward. I guess that many of us would want to turn the clock back if that were possible, but we recognise that it is not, and this may well be, therefore, a better way to proceed. It is certainly one to which the Government should be giving considerable attention.
While I am on my feet, perhaps I may remind the noble Lord, Lord Ashton, of the amendment that I moved with my noble and learned friend Lady Butler-Sloss during the debate in April on the digital legislation. I particularly draw his attention to col. 40 on 20 March and the remarks made by his right honourable friend the Minister of State for Digital in the other place on 26 April, when he described the question of prohibited material and definitions, which we had argued should be consistent across varying media platforms. They both said that this was unfinished business that would be returned to. I have studied the Green Paper but have not been able to find the solution to that unfinished business, and wonder whether it will be addressed as the legislation proceeds.
Perhaps I may also ask the Minister about the protection of minors. It has been stated again and again, by all noble Lords who have participated so far, including the noble Lord, Lord Storey, that the protection of children should be a paramount consideration at all times. The Minister may recall the case, which I raised with the Secretary of State and in your Lordships’ House, of some young people who had visited suicide sites. I was horrified to learn from the headmaster of a school in Lancashire, where I arrived to distribute prizes, that a child who had visited a suicide site had taken their own life only that morning. What further protections are being provided to require service providers, for whom self-regulation is clearly not enough, to do rather more about that question?
It has been said that parents do not have a chance in this situation; that is absolutely right. As my noble friend Lady Hollins said, young people spend a vast amount of time on the internet. Many parents do not understand how it works. It is therefore crucial that we do all we can to place pressure on the service providers. I remind the House of the advice that Aristotle gave parents. He said that only a bad parent would place their children in the hands of a foolish storyteller. I fear that many of us, maybe inadvertently and without knowing the full consequences of placing our children in the hands of the Twittersphere and the digital world, with all the information that pours into their minds on a massive scale, have placed them into bad hands. We need to do more to protect them. This is what my noble friend is trying to do and I commend her amendment to the House.
My Lords, I support the aim of these amendments, as do other noble Lords who have spoken. They were extraordinarily well introduced, given the scope of what they are intended to achieve. As I said at Second Reading, I do not have the same authority and technical background in the industry as many noble Lords who have taken part, particularly the noble Baroness, Lady Harding. However, I have a legitimate question for the noble Baroness, and the Minister, who will have heard the general support around the House, will also be aware of it. However good the intentions of the amendments—and I support their aims—it is difficult to regulate in a world in which technical capacity is international. As the noble Baroness, Lady Harding, said, these matters are rather low on the agendas of the major global corporations which are responsible for producing the technology, delivering the content and organising the platforms that children may be accessing, appropriately or not. It is legitimate to ask, as she did, whether what we say and how we regulate in this country can be a beacon. I think she said that this could be the beginning of a geographical spread of better regulation. It would be pointless to ignore the fact that we are dealing not with an internal issue of domestic regulation, as we would be with terrestrial broadcasting, but with global corporations, most of them based on the west coast of the United States, which do not necessarily even agree with the aims of these amendments, as I very certainly do.
There may be some confusion now. I am not saying that children’s data is not important or that data protection for children is not important: clearly they are. However, the internet safety strategy addresses an overall, comprehensive range of measures that is about more than just data protection. We want to have a comprehensive strategy, which I am going to come to, to talk about safety. Nobody in their right mind is saying that we should not protect children, not only on the domestic front but internationally, as the noble Baroness, Lady Jay, said. Let me continue and I am sure all will become clear. If it does not, I am sure that the noble Baroness and others will cross-question me. If I have misunderstood what the noble Lord, Lord Knight, is getting at, I will look at Hansard and get back to him. I am sure we will come to this again.
We have a clear plan of action to raise the level of safety online for all users, as set out in the internet safety strategy. We are consulting on a new code of practice for the providers of online social media platforms, as required by the Digital Economy Act. That will set best practice for platform providers in offering adequate online protection policies, including minimum standards. Approaching the problem in this way as a safety matter, rather than a data protection matter, ensures we can tackle the problem while avoiding a debate over whether we are compliant with the GDPR. The internet safety strategy also outlines the Government’s promotion of “Think safety first” for online services. This will aim to educate and encourage new start-ups and developers to ensure that safety and privacy are built into their products from the design phase. Examples of this type of approach include having robust reporting mechanisms for users. We are looking at whether extra considerations should be in place on devices that are registered as being used by a child.
It is essential that we take a careful and considered approach to affecting the design standard of online services. Making overly complex or demanding requirements may result in negative consequences. Let me explain why. Amendments 18 and 19 essentially offer website operators a stark choice. Websites will need to either invest in upgrading standards and design or withdraw their services for use by under-16s. This is dangerous for the following reasons.
First, it could cause a displacement effect where children move to less popular platforms that would potentially not comply with such requirements—the noble Baroness, Lady Jay, talked about foreign sites. It is often more difficult to monitor these services and to ensure they have the basic protections that we expect from more legitimate sites. Platforms comply either because they are responsible or because they believe that the regulator will take enforcement action against them. Platforms hosted overseas may not always comply, because to do so would reduce the volume of users and potential monetisation, and the risk of enforcement action may be low.
Secondly, it is likely that young people, particularly those who already use these sites, may lie about their age to circumvent restrictions. This could have negative consequences for the prosecution of online grooming and underage sex: teenagers would be vulnerable to the assumption that they are over 16; adults could use this as a defence for their conduct; and sites may not be as accountable for the content that children are exposed to. This is not an imaginary problem. There have been cases of acquittal at trial, where men have had sexual relations with underage girls after meeting them on sites for over-18s only, using their presence on the site as a defence for believing them to be adults.
Thirdly, circumvention may be sought through the use of mechanisms to anonymise—I am having a problem with my pronunciation too—the use of the internet. Young people may adopt anonymising tools such as VPNs to access non-UK versions of the sites. This would make it more difficult for law enforcement to investigate, should they be exploited or subject to crime.
Fourthly, there is already in place a variety of legislation to safeguard children. Any change brought in through this Bill would have potential ramifications for other statutes. Altering how children make use of online service providers would need to be carefully worked through with law enforcement agencies to ensure that it did not damage the effectiveness of safeguarding vulnerable people.
Fifthly, these amendments do not just apply to social media services. A broad range of online services would be affected by this proposal, from media players to commerce sites. The kinds of services that would be caught by this amendment include many that develop content specifically for young people, including educational materials, not to mention the wider impact on digital skills if children are forced offline.
I move on now to more practical considerations. I am concerned that the amendments as drafted, while an elegant proposal, could serve to create confusion about what sites have to do. We know that the GDPR will apply from 25 May, and I am not convinced that this will allow enough time for the commissioner to consult on the guidance, prepare it, agree it and lay it before Parliament, and for companies to be compliant with it. Online service providers will need to adhere to the new requirements from May 2018, and may have existing customers that the new provisions will apply to. They will need some time to make any necessary changes in advance. Even with the transition period available in the amendment, this would lead to considerable uncertainty and confusion from online services about the rules they will have to follow come May. This could result in the problems that I have already laid out.
Finally, the Information Commissioner has raised a technical point. These amendments would apply only where consent is the lawful basis for processing data. Children also have access to online services where the data controller relies on a contractual basis or vital interests to offer services, rather than reliance on consent. Therefore, the amendments may have less reach than seems to be envisaged and are likely to lead to confusion as to which services the requirements apply to.
In summary, in spite of our appreciation of the aims of these amendments, we have concerns. They may prove dangerous to the online safety of children and young people. Creating unnecessary and isolated requirements runs the risk of being counterproductive to other work in this space. There needs to be some serious and detailed discussion on this before any changes are made. Furthermore, the technical and legal drafting of the amendments remains in question.
There is no doubt that further work needs to be done in the online safety space to ensure the robust and sustainable protection of our children and young people online. We have demonstrated commitment to this through the work on the internet safety strategy and the Digital Economy Act. We are working on these issues as a matter of priority, but strongly believe that it is better to address them as a whole rather than pursue them through the narrow lens of data protection. We need to work collaboratively with a wide range of stakeholders to ensure that we get the right approach. The noble Baroness, Lady Kidron, for example, was among those who attended the parliamentarians’ round table on the internet safety strategy, which she mentioned, hosted by the Secretary of State last week. We are engaged on this issue and are not pursuing the work behind locked doors. These specific amendments, however, are not the right course of action to take at this time.
My Lords, the Minister has just referred to the round table. He will recall that I mentioned in my remarks the issue of definitions and suicide sites that were raised during that round table last week. Can he tell the House any more about that?
I was not at the round table, and I am afraid that I would require some notice to answer that question. I am certainly happy to write to the Committee about that. I had not forgotten; I just do not have an answer.
Given the arguments that I have laid out, I would like to reassure the House that this issue remains a high priority. The noble Lord, Lord Knight, asked whether GOV.UK’s Verify site could be used for age verification. Verify confirms identity against records held by mobile phone companies, HM Passport Office, the DVLA and credit agencies, so it is not designed for use by children. We will continue to work with interested parties to improve internet safety, but in a coherent and systematic way. For the moment, and in anticipation of further discussions, I ask the noble Baroness to withdraw her amendment.
I now move to Amendment 20A from the noble Lords, Lord Stevenson and Lord Kennedy, on the requirement for a review of Clause 8. Again, the Government agree with the spirit of this amendment in ensuring that the legislation we are creating offers the protections that we desire. However, there are a few issues that we would like to address.
First, it is government practice to review and report in cases of new legislation like this. Bringing about a mandatory report in this case is therefore unnecessary. Furthermore, prescribing the specific content of such a report at this stage is counterproductive. This is especially true given the complex and wide-ranging nature of child online safety and the work being conducted by the Government in this space.
Secondly, on timings, as noble Lords are aware, we must comply with the GDPR from 25 May next year, by which time the Bill must be passed. I am concerned, therefore, that requiring a review to be published within 12 months of the Bill passing would not leave sufficient time to produce a meaningful report. Companies need time to bring in new mechanisms to be compliant with the regulation. For data to be created and collected, time must be given for sites to be tested and used under the new regulations. This will allow for the comparison of robust data that also reflects other work around online safety, which is still being developed. For those reasons, I ask the noble Lords not to press their amendments.