(5 years, 9 months ago)
I agree with the right reverend Prelate, and that is something the Government are considering.
My Lords, I welcome the response that Matt Hancock has given to the father of 14-year-old Molly Russell, who took her life in 2017, having visited one of these suicide sites. That was a year in which the suicide rate among young females increased by 38%. As long ago as 7 December 2006, I asked the Government to amend the Suicide Act 1961 to enable the,
“banning of internet sites which may incite people to, or advise people on how to, commit suicide”.—[Official Report, 7/12/2006; col. WA 157.]
This is an issue I have raised on a dozen occasions since then, along with the noble Baroness, Lady Massey. While I welcome the White Paper and legislation, will the Minister confirm that this is an urgent issue, which ought to be dealt with as expeditiously as possible?
I agree that it is extremely important; we should expect social media companies to take responsibility, and we should hold them to account. The Secretary of State for Health and Social Care has met social media companies and written to them on this issue. He held a round table on 7 February to discuss what more can be done, and his department will host a follow-up round table in two months to review progress, so the department is taking this seriously. In addition, bearing in mind what the right reverend Prelate said, we are thinking about those issues, as the noble Lord will see when the White Paper comes out.
(5 years, 10 months ago)
I agree with the noble Lord. One of the gambling licensing conditions that betting organisations have to abide by is that they should act responsibly and specifically not target young and vulnerable people. It is up to the Gambling Commission to make sure that they abide by their licensing conditions.
My Lords, when the Minister looks at the effect of gambling on young people, will he take into account the survey conducted by ParentZone yesterday about a new phenomenon called skin gambling? It said:
“Our survey confirmed it is wide-spread, with 10% of children across the UK aged 13-18 revealing they have gambled skins in some form. This percentage amounts to approximately 448,744 children”.
This is surely one of the new phenomena now appearing on social media and elsewhere that are targeted at young people, and the Government need always to be ahead of the game in these kinds of circumstances.
The Government are aware of that. When in-game items such as skins can be used to place a bet or to gamble, and can be converted into cash, that is considered gambling and requires a licence. The Gambling Commission has taken action and prosecuted unlicensed gambling of in-game items known as skins. We are seeking to work with the video games industry to raise awareness of this and to explore solutions, but I take the noble Lord’s point. We are aware of gambling in games, and it is a new issue of which we are taking account.
(7 years ago)
There may be some confusion now. I am not saying that children’s data is not important or that data protection for children is not important: clearly they are. However, the internet safety strategy sets out a comprehensive range of measures that goes beyond data protection alone. We want to have a comprehensive strategy, which I am going to come to, to talk about safety. Nobody in their right mind is saying that we should not protect children, not only on the domestic front but internationally, as the noble Baroness, Lady Jay, said. Let me continue and I am sure all will become clear. If it does not, I am sure that the noble Baroness and others will cross-question me. If I have misunderstood what the noble Lord, Lord Knight, is getting at, I will look at Hansard and get back to him. I am sure we will come to this again.
We have a clear plan of action to raise the level of safety online for all users, as set out in the internet safety strategy. We are consulting on a new code of practice for the providers of online social media platforms, as required by the Digital Economy Act. That will set best practice for platform providers in offering adequate online protection policies, including minimum standards. Approaching the problem in this way, as a safety matter rather than a data protection matter, ensures that we can tackle it while avoiding a debate over whether we are compliant with the GDPR. The internet safety strategy also outlines the Government’s promotion of “Think safety first” for online services. This will aim to educate and encourage new start-ups and developers to ensure that safety and privacy are built into their products from the design phase. Examples of this type of approach include having robust reporting mechanisms for users. We are also looking at whether extra considerations should be in place on devices that are registered as being used by a child.
It is essential that we take a careful and considered approach to shaping the design standards of online services. Imposing overly complex or demanding requirements may have negative consequences. Let me explain why. Amendments 18 and 19 essentially offer website operators a stark choice: either invest in upgrading standards and design, or withdraw their services for use by under-16s. This is dangerous for the following reasons.
First, it could cause a displacement effect where children move to less popular platforms that would potentially not comply with such requirements—the noble Baroness, Lady Jay, talked about foreign sites. It is often more difficult to monitor these services and to ensure they have the basic protections that we expect from more legitimate sites. Platforms comply either because they are responsible or because they believe that the regulator will take enforcement action against them. Platforms hosted overseas may not always comply, because to do so would reduce the volume of users and potential monetisation, and the risk of enforcement action may be low.
Secondly, young people, particularly those who already use these sites, are likely to lie about their age to circumvent restrictions. This could have negative consequences for the prosecution of online grooming and underage sex: teenagers would be vulnerable to the assumption that they are over 16; adults could use this as a defence for their conduct; and sites may not be as accountable for the content that children are exposed to. This is not an imaginary problem. There have been acquittals at trial in which men had sexual relations with underage girls after meeting them on over-18s-only sites and used their presence on the site as a defence for believing them to be adults.
Thirdly, circumvention may be sought through the use of mechanisms to anonymise—I am having a problem with my pronunciation too—the use of the internet. Young people may adopt anonymising tools such as VPNs to access non-UK versions of the sites. This would make it more difficult for law enforcement to investigate, should they be exploited or subject to crime.
Fourthly, there is already in place a variety of legislation to safeguard children. Any change brought in through this Bill would have potential ramifications for other statutes. Altering how children make use of online service providers would need to be carefully worked through with law enforcement agencies to ensure that it did not damage the effectiveness of safeguarding vulnerable people.
Fifthly, these amendments do not just apply to social media services. A broad range of online services would be affected by this proposal, from media players to commerce sites. The kinds of services that would be caught by this amendment include many that develop content specifically for young people, including educational materials, not to mention the wider impact on digital skills if children are forced offline.
I move on now to more practical considerations. I am concerned that the amendments as drafted, though an elegant proposal, could create confusion about what sites have to do. We know that the GDPR will apply from 25 May, and I am not convinced that this will allow enough time for the commissioner to consult on the guidance, prepare it, agree it and lay it before Parliament, and for companies to become compliant with it. Online service providers will need to adhere to the new requirements from May 2018 and may have existing customers to whom the new provisions will apply. They will need some time to make any necessary changes in advance. Even with the transition period available in the amendment, this would lead to considerable uncertainty and confusion among online services about the rules they will have to follow come May. That could result in the problems I have already laid out.
Finally, the Information Commissioner has raised a technical point. These amendments would apply only where consent is the lawful basis for processing data. Children also have access to online services where the data controller relies on a contractual basis or vital interests, rather than on consent, to offer services. The amendments may therefore have less reach than seems to be envisaged and are likely to lead to confusion as to which services the requirements apply to.
In summary, although we appreciate the aims of these amendments, we have concerns. The amendments may prove dangerous to the online safety of children and young people. Creating unnecessary and isolated requirements runs the risk of being counterproductive to other work in this space. There needs to be some serious and detailed discussion on this before any changes are made. Furthermore, the technical and legal drafting of the amendments remains in question.
There is no doubt that further work needs to be done in the online safety space to ensure the robust and sustainable protection of our children and young people online. We have demonstrated our commitment to this through the work on the internet safety strategy and the Digital Economy Act. We are working on these issues as a matter of priority, but we strongly believe that it is better to address them as a whole rather than through the narrow lens of data protection. We need to work collaboratively with a wide range of stakeholders to ensure that we get the right approach. The noble Baroness, Lady Kidron, for example, was among those who attended the parliamentarians’ round table on the internet safety strategy, which she mentioned, hosted by the Secretary of State last week. We are engaged on this issue and are not pursuing the work behind closed doors. These specific amendments, however, are not the right course of action to take at this time.
My Lords, the Minister has just referred to the round table. He will recall that I mentioned in my remarks the issue of definitions and suicide sites, which was raised during that round table last week. Can he tell the House any more about that?
I was not at the round table, and I am afraid that I would require some notice to answer that question. I am certainly happy to write to the Committee about that. I had not forgotten; I just do not have an answer.
Given the arguments that I have laid out, I would like to reassure the House that this issue remains a high priority. The noble Lord, Lord Knight, asked whether GOV.UK’s Verify site could be used for age verification. Verify confirms identity against records held by mobile phone companies, HM Passport Office, the DVLA and credit agencies, so it is not designed for use by children. We will continue to work with interested parties to improve internet safety, but in a coherent and systematic way. For the moment, and in anticipation of further discussions, I ask the noble Baroness to withdraw her amendment.
I now move to Amendment 20A from the noble Lords, Lord Stevenson and Lord Kennedy, on the requirement for a review of Clause 8. Again, the Government agree with the spirit of this amendment in ensuring that the legislation we are creating offers the protections that we desire. However, there are a few issues that we would like to address.
First, it is government practice to review and report on new legislation such as this. Mandating a report in this case is therefore unnecessary. Furthermore, prescribing the specific content of such a report at this stage would be counterproductive. This is especially true given the complex and wide-ranging nature of child online safety and the work being conducted by the Government in this space.
Secondly, on timing, as noble Lords are aware, we must comply with the GDPR from 25 May next year, by which time the Bill must be passed. I am concerned, therefore, that requiring a review to be published within 12 months of the Bill passing would not leave sufficient time to produce a meaningful report. Companies need time to bring in new mechanisms to comply with the regulation, and for data to be created and collected, time must be given for sites to be tested and used under the new rules. This will allow for the comparison of robust data that also reflects other work on online safety, which is still being developed. For those reasons, I ask the noble Lords not to press their amendments.