Lord Knight of Weymouth debates involving the Department for Digital, Culture, Media & Sport during the 2017-2019 Parliament

Online Harms White Paper

Tuesday 30th April 2019

Lords Chamber
Lord Knight of Weymouth (Lab)

My Lords, it is a pleasure to follow the noble Baroness, Lady Howe, and to precede the noble Baroness, Lady Benjamin, both of whom have consistently campaigned on the dangers of the internet to children. I agree with what the right reverend Prelate the Bishop of St Albans said on gambling; I would support a ban on advertising at football matches.

By way of reminding your Lordships of my interests, particularly as a chief officer at TES, a digital education business, and as chair of xRapid, a health tech business, I will start by reminding the House of the upside of the online world. TES has 11.5 million registered users, and, as a platform for teachers, facilitates the global sharing of teaching resources. This saves teachers buckets of time and helps them access a torrent of quality user-generated content. It is inconceivable without the internet. My other interest trains iPhones to do the work of microscopists in diagnosing malaria, which we are now able to give away to those who need it—laboratory quality at zero marginal cost, thanks to online technology. There are many other examples of technology for good, and if we do not grasp them but instead allow our public services to stagnate, we will be left behind as other nations leapfrog our development.

However, the harm of the internet is also a reality. Many of us are working out how to manage it. I am guilty of routinely overindulging in my screen time—I am digitally obese. At home, our seven year-old, Coco, asked us just this week whether we can agree as a family our own code for gaining consent if we want to post images of each other on social media. That is a job for this weekend. But there are areas where self-regulation will not apply and where we need urgent government and legislative action.

I urge your Lordships to take 15 minutes to watch Carole Cadwalladr’s brave TED talk, delivered earlier this month in Vancouver. As the journalist who uncovered the Cambridge Analytica scandal, she has credibility in her charge that our democracy has been broken by Facebook. Her argument is compelling. Communities such as Ebbw Vale, with very few immigrants, voted overwhelmingly for Brexit because of their fear of immigration. Such communities are not consumers of the mainstream, right-wing media that stir that particular pot, but they are consumers of Facebook. She describes Facebook as a “crime scene”, where the likes of Nigel Farage were able to oversee what she uncovered. Who knows how much money from who knows where was able to fund a firehose of lies through Facebook ads. These were targeted at those who were most vulnerable to believing them, using the illegal hack of personal data from tens of millions of users.

The online harm to individuals, as other noble Lords have talked about, is profound, but there can be no greater harm to a nation state than the catastrophe of Brexit, brought about by a referendum won by illegal campaigning—and we allow Nigel Farage to start another party to dupe the nation once more. We desperately need to update our electoral law to prevent this destruction of our democracy, and I hope that the legislation following this White Paper may present some opportunities for us to do so.

I must also say that I commend this White Paper. I inevitably want it to go further, but the core proposals of a duty of care and of a regulator are sound. As the manager of a TES resources platform, I welcome those regulatory burdens. I am particularly delighted to see the duty of care principle. For some time I have been keen to see this well-established legal principle from the physical world come into the virtual world. I was introduced to the notion by Will Perrin and I pay tribute to him and his collaborators at the Carnegie Trust, and to the Government for listening to them. My assumption has been that, when applied, this will generate civil action in the courts by victims against technology operators for the damage caused by their algorithms and other relevant actions. Can the Minister say whether this will be available under the government plans, or will redress be available only through the regulator?

Speaking of victims of algorithms, I am also interested in whether the measures here will apply to the Government themselves and other public bodies. Can the Minister please help me? I have spoken before about the worrying case of the sentencing algorithm used in Wisconsin courts that defence attorneys were prevented from examining. We have had another example closer to home. Last year it came to light that our Home Office had deported potentially thousands of students on the strength of a contractor’s machine analysis of voice recordings. It had asked the Educational Testing Service to analyse voice files to work out whether students were using proxies to sit their English tests, and an immigration appeal tribunal in 2016 heard that, when ETS’s voice analysis was checked with a human follow-up, the computer had been correct in only 80% of cases—meaning that some 7,000 students had their visas revoked in error.

Given what we know about algorithmic bias, and the growing use of algorithms for public service delivery, it is critical that public bodies are also subject to the measures set out in the White Paper. I would also say that, since the Government are increasingly building technology platforms to compete with the private sector, it would be unfair not to impose the same regulatory burdens upon them as there are on those of us working in the commercial world.

My final point relates to the valid point made in the document that technology can be part of the solution. I agree. But there is a danger that the demands placed on technology companies will assume that they are all of the size and wealth of Facebook, Amazon, Google and Apple. This would be a mistake. They can afford to develop solutions and gain a competitive advantage over smaller businesses as a result. We need to ensure that these measures result in a more, not less, competitive landscape. If there are technology solutions to solve difficult problems such as the copyright infringements that I grapple with or other thefts of intellectual property, those tools should be openly available to platform providers of all sizes.

When Sir Tim Berners-Lee invented the web he had a great vision that it should be for everyone. Earlier this month he said that the internet,

“seemed like a good idea at the time”,

that the world was certainly better for it, but that,

“in the last few years, a different mindset has emerged”.

At the 30-year point, people have become worried about their personal data, but they,

“didn’t think about it very much until Cambridge Analytica”.

The privacy risk, however, “is subtle”, he argued:

“It’s realising that all this user generated data is being used to build profiles of me and everyone like me—for targeted ads and more importantly, voting manipulation. It’s not about the privacy of photographs, but where my data is abused”.


We need new duties on technology companies and we need a regulator with teeth. I wish the Government well and I hope that we will see legislation on this very soon.

Social Media: News

Thursday 11th January 2018

Lords Chamber
Lord Knight of Weymouth (Lab)

My Lords, the noble Baroness, Lady Kidron, made her case brilliantly, and I join others in saluting the leadership that she is showing the House by raising these issues and getting action in the Data Protection Bill. Her vision is exactly the opposite of what we see in the United States from the FCC in ending net neutrality, which is taking that country into such a dark place in this context.

I remind the House of my interests in respect of my employment at TES Global, which is both a publisher of what used to be known as the Times Educational Supplement and the provider of a substantial platform for teachers to find jobs and share teaching resources, so I find myself on both sides of the false argument, as described by the noble Baroness, Lady Harding. Indeed, I am also an avid user of social media. On the way in here, LinkedIn told me to congratulate one Ray Collins on seven years working at the House of Lords, which I am of course happy to do.

Social media and user-generated content are here to stay. I do not believe it is possible to pre-moderate all the content shared all the time. If we were to ask social media companies to do so, it would be an extraordinary barrier to entry for anyone wanting to create competition in this space, which I think we would want if we want the sort of new tools and platforms described by the noble Baroness, Lady Scott. But I do not disagree that we need to do much better, especially on content accessible by children. It needs new policy thinking and a regulatory solution that respects the consumers’ desire to share digital content and their need for trusted content and providers. My view is that tech companies are media companies, but that does not mean that the regulatory regime for traditional media is appropriate or in the public interest. Like the noble Baroness, Lady Harding, I think we need a new regulatory regime for online service providers.

The media need to keep evolving their business models, away from monetising content towards generating traffic and data for other purposes within the business, in a new environment regulated by the GDPR, and they will need new models of regulation. I also think that it is in the interests of the likes of Facebook to ensure that advertising revenue is fairly shared with the media companies whose content is shared widely for free on their platforms. I am told that they are having those conversations; I hope that we will get concrete action.

I also echo the point made by the noble Baroness, Lady Harding, about engineering time. Perhaps Google should extend its famous 20% time so that a percentage, let us say even 10%, of its engineers’ time goes towards helping to solve some of these problems.

The issue of fake news is, of course, ringing in our ears. I am interested in how France’s President Macron is suggesting regulation of social media platforms during election periods. Perhaps we could restrict sharing to Ofcom-regulated news outlets; I do not know. We will have to see how that and the German experiment work.

Fundamentally, I would love to have a counterpoint on social media to my echo chamber. If I could press a button and see that, it would help my sense of what is going on on the other side. We need accountability, not always up to regulators but sometimes down to users. We will get some, thanks to the Data Protection Bill: the right to an explanation and some data mobility measures provide accountability. I am also interested in data trusts and the politics of data. Perhaps we will end up needing collective action, the equivalent of a new digital trade union movement for platform users, so that we can impose some data ethics and withdraw our data from the companies that are so hungry for it unless they give us the ethical safeguards we need.

Finally, I echo what my noble friend Lord Puttnam said about more study of this in schools and in wider society. Then, perhaps, we can have an informed debate to find an imaginative policy solution to these pressing issues.

Data Protection Bill [HL]

Monday 6th November 2017
Committee: 2nd sitting

Lords Chamber
Lord Knight of Weymouth (Lab)

My Lords, I support the amendments. I remind the House of my interests in relation to my work at TES, the digital education company.

The noble Baroness, Lady Kidron, and the others who have supported the amendment have given the Government a pretty neat way out of the problem that 13 as the age of consent for young people to sign up to “information society services”, as the Bill likes to call them, feels wrong. I have found that for many Members of your Lordships’ House, 16 feels like a safer and more appropriate age, for all the reasons that the noble Lord, Lord Storey, has just given in terms of defining when children are children. There is considerable discomfort about 13 in terms of where the Bill currently sits.

However, I think many noble Lords are realists and understand that to some extent the horse has bolted. Given the huge numbers of young people currently signing up to these services who are under 13, trying to pretend that we can find a way of forcing the age up to 16 from the accepted behavioural norm of 13 looks challenging. Yet we want to protect children. So the question is whether these amendments would provide that solution. That hinges on whether it is reasonable to ask the suppliers of information society services to verify age, and whether it is then reasonable to ask them to design in an age-appropriate fashion. From my experience, the answer to both is yes, it is. Currently, all you do is tick a box to self-verify that you are the age you are. If subsequently you want to have your data deleted, you may have to go through a whole rigmarole to prove that you are who you are and the age you say you are, but for some reason the service providers do not require the same standard of proof and efficacy at the point where you sign up to them. That is out of balance, and it is effectively our role to put it back into balance.

The Government themselves, through the Government Digital Service, have an exceedingly good age-verification service called, strangely, Verify. It does what it says on the tin, and it does it really well. I pay tribute to the GDS for Verify as a service that it allows third parties to use: it is not used solely by Government.

So age verification is undoubtedly available. Next, is it possible—this was explored in previous comments, so I will not go on about it—for age-appropriate design to be delivered? From our work at TES, I am familiar with how you personalise newsfeeds based on data, understanding and profiling of users. It is worth saying, incidentally, that those information society service providers will be able to work out what age their users are from the data that they start to share: they will be able to infer age extremely accurately. So there is no excuse for not knowing how old their users are. Any of us who use any social media services will know that the feeds we get are personalised, because they know who we are and they know enough about us. It is equally possible, alongside the content that is fed, to shift some aspects of design. It would be possible to filter content according to what is appropriate, or to give a slightly different homepage, landing page and subsequent pages, according to age appropriateness.

I put it to the Minister, who I know listens carefully, that this is an elegant solution to his problem, and I hope that he reflects, talks to his colleague the right honourable Matthew Hancock, who is also a reasonable Minister, and comes back with something very similar to the amendments on Report, assuming that they are not pressed at this stage.

Baroness Hollins (CB)

My noble friend made a very strong case. The internet was designed for adults, but I think I am right in saying that 25% of time spent online is spent by children. A child is a child, whether online or offline, and we cannot treat a 13 year-old as an adult. It is quite straightforward: the internet needs to be designed for safety. That means it must be age appropriate, and the technology companies need to do something about it. I support the amendments very strongly.

--- Later in debate ---
Lord Knight of Weymouth

I apologise to the Minister for interrupting. I am just interested in that confusion that he talks about. Perhaps I am incorrect, but I understand that images, for example, are data. There is a lot of concern about sexting and about platforms such as Snapchat and the sharing of data. Where is the confusion? Is it in the Government, or in the Chamber?

Lord Ashton of Hyde

I do not think I mentioned confusion. What we are talking about in the Bill is purely data protection. We are talking about the age at which children can consent to information society services handling their data. What I think the noble Baroness, and a lot of Peers in the House, are talking about is keeping children safe online, which is more than just protection of their personal data.

--- Later in debate ---
Lord Ashton of Hyde

I am happy to confirm those two points. On extraterritoriality, I agree with the noble Baroness that it is difficult to control. Commercial sites are easier—an example of which is gambling. We can control the payments, so if they are commercial and cannot pay people, they may well lose their attractiveness. Of course, the only way to solve this is through international agreement, and the Government are working on that. Part of my point is that, if you drive children away to sites located abroad, there is a risk in that. The big, well-known sites are by and large responsible. They may not do what we want, but they will work with the Government. That is the thrust of our argument. We are working with the well-known companies and, by and large, they act responsibly, even if they do not do exactly what we want. As I say, however, we are working on that. The noble Baroness is right to say that, if we drive children on to less responsible sites based in jurisdictions with less sensible and acceptable regimes, that is a problem.

Lord Knight of Weymouth

Could the Minister help me with any information he might have about what was envisaged when the GDPR was drawn up? It must have been envisaged when Article 8 was put together that some member states would go with something different—be it 13, 16 or whatever. The issue of providers based abroad must have been thought about, as well as verifying age, parental consent, or the verification of parental identity to verify age. Article 8 just talks about having to have parental sign-off. These issues of verification and of children going off to foreign providers must have been thought about when the article was being put together in Europe. Does he have any advice on what they thought would be done about this problem?

Lord Ashton of Hyde

I cannot give the noble Lord chapter and verse on what the European bureaucrats were thinking when they produced the article, but age verification is not really the issue on this one, because it is extremely difficult to verify ages below 18 anyway: although one can get a driving licence at 17, it is only at 18 that you can have a credit card. As I say, the issue here is not age verification—rather, it is that, when we make things too onerous, we risk driving people on to other sites which take their responsibilities less seriously. That was the point I was trying to make.

--- Later in debate ---
We are therefore at an important time. By agreeing this amendment, we can ensure that PSHE will be the vehicle by which these issues can be taught to all children in all schools. I hope that when we come to Report the Minister will be able to report that that will be the case. I beg to move.
Lord Knight of Weymouth

My Lords, does the Minister agree with the noble Lord, Lord Storey, that PSHE would be the most appropriate way to educate young people about data rights? If so, I note that the Secretary of State, Justine Greening, has today announced that Ian Bauckham will lead the review on how relationship and sex education for the 21st century will be delivered. Can the Minister, who is clearly prepared to think about this appointment today, ask whether it is within his scope to think about how data rights education may be delivered as part of that review, and whether the review will draw on the work of the previous person who reviewed the delivery of PSHE, Sir Alasdair Macdonald, the last time Parliament thought that compulsory SRE was a good idea?

Baroness Kidron

I support the amendment. I was on the House of Lords Communications Committee, to which the noble Lord just referred. We recommended that digital literacy be given the same status as reading, writing and arithmetic. We set out an argument for a single cross-curricular framework of digital competencies—evidence-based, taught by trained teachers—in all schools whatever their legal status.

At Second Reading, several noble Lords referred to data as the new oil. I have been thinking about it since: I am not so certain. Oil may one day run out; data is infinite. What I think we can agree is that understanding how data is gathered, used and stored, and, most particularly, how it can be harnessed to manipulate both your behaviour and your digital identity, is a core competency for a 21st-century child. While I agree with the noble Lord that the best outcome would be a single, overarching literacy strategy, this amendment would go some small way towards that.

Data Protection Bill [HL]

Monday 30th October 2017

Lords Chamber
Lord Clement-Jones

My Lords, I thank the noble Baroness for that accolade. I rise to speak to Amendment 170, which is a small contribution to perfecting Amendment 169. It struck me as rather strange that Amendment 152 has a reference to charities, but not Amendment 169. For charities, this is just as big an issue so I wanted to enlarge slightly on that. This is a huge change that is overtaking charities. How they are preparing for it and the issues that need to be addressed are of great concern to them. The Institute of Fundraising recently surveyed more than 300 charities of all sizes on how they are preparing for the GDPR, and used the results to identify a number of areas where it thought support was needed.

The majority of charities, especially the larger ones, are aware of the GDPR and are taking action to get ready for May 2018, but the survey also highlighted areas where charities need additional advice, guidance and support. Some 22% of the charities surveyed said that they have yet to do anything to prepare for the changes, and 95% of those yet to take any preparatory action are the smaller charities. Some 72% said that there was a lack of clear available guidance. Almost half the charities report that they do not feel they have the right level of skills or expertise on data protection, and 38% report that they have found limits in their administration or database systems, or the costs of upgrading these, a real challenge. That mirrors very much what small businesses are finding as well. Bodies such as the IoF have been working to increase the amount of support and guidance on offer. The IoF runs a number of events, but more support is needed.

A targeted intervention is needed to help charities as much as it is needed for small business. This needs to be supported by government—perhaps through a temporary extension of the existing subsidised fundraising skills training, including an additional training programme on how to comply with GDPR changes; or a targeted support scheme, directly funded or working with other funding bodies and foundations, to help the smallest charities most in need to upgrade their administrative or database systems. Charities welcome the recently announced telephone service from the ICO offering help on the GDPR, which they can access, but it is accessible only to organisations employing under 250 people and it is only a telephone service.

There are issues there, and I hope the Minister will be able to respond, in particular by recognising that charities are very much part of the infrastructure of smaller organisations that will certainly need support in complying with the GDPR.

Lord Knight of Weymouth (Lab)

My Lords, I broadly support what these interesting amendments are trying to do. I declare my interest as a member of the board of the Centre for Acceleration of Social Technology. Substantially, what it does is advise charities, normally larger ones, on how best to take advantage of digital to solve some of their problems.

Clearly, I support ensuring that small businesses, small charities and parish councils, as mentioned, are advised of the implications of this Act. If she has the opportunity, I ask the noble Baroness, Lady Neville-Rolfe, to explain why she chose staff size as the measure. I accept that hers is a probing amendment and she may think there are reasons not to go with staff size. The cliché is that when Instagram was sold to Facebook for $1 billion it had 13 members of staff. That would not come within the scope of the amendment, but there are plenty of digital businesses that can achieve an awful lot with very few staff. As it stands, my worry is that this opens up a huge loophole.

Lord Maxton (Lab)

I entirely agree with my noble friend. The point I was going to make is that small companies are often very wealthy. In the global digital world that is the fact: you do not need the same number of employees as in the past. Equally, would the amendment apply to five employees globally, or just in this country?

Lord Knight of Weymouth

Certainly, if the amendment were to have any legs in terms of using the number of employees as a parameter, that would have to be defined. However you choose to define the size of an organisation, you would need to explore how to work that out.

Baroness Neville-Rolfe

I chose five employees because it often denotes a small organisation or a small business. I can see that some of the businesses in that category might be fairly large. I would of course have no objection to adding an extra criterion, such as turnover, if there was a mood to write exemptions into the Bill. Other legislation has exemptions for smaller bodies. The overall objectives of the data protection legislation clearly have to be achieved but I am concerned that, in particular, some of the subsidiary provisions, such as fines and fees, which I mentioned, are demanding and worrying for smaller entities.

Lord Knight of Weymouth

I am grateful for the noble Baroness’s comments. Something certainly can be done to think more about turnover than the number of employees; otherwise there would be a big loophole, particularly around marketing, where a company could be set up to harvest data to which the Act would not apply and then sell that data on. It would not need very many people at all to pursue that opportunity.

The other thing these amendments allow us to do is ask the Minister to enlighten us a little on his thinking about how the Information Commissioner’s role will develop. In particular, if it is to pursue the sorts of education activities set out in these amendments, how will it be resourced to do so? I know there are some career-limiting aspects for Ministers who promise resources from the Dispatch Box, but the more he can set out how that might work, the more welcome that would be.

Lord Arbuthnot of Edrom (Con)

My Lords, I declare my interests as a chairman of a charity and of a not-for-profit organisation, and as a director of some small businesses. Having said that, I agree with every word that my noble friend Lady Neville-Rolfe said.

The Association of Accounting Technicians has said that the notion that the GDPR will lead to a €2.3 billion cost saving for the European Union is absurd. I agree. The Federation of Small Businesses has said that a sole trader might have to pay £1,500 for the work needed, and someone with 25 employees might have to pay £20,000. In the Second Reading debate my noble friend Lord Marlesford talked rather poignantly about his parish council. It might be impossible to exempt organisations such as those from European Union regulations. But if that is so, I hope that my noble friend the Minister will say, first, why it is impossible and, secondly, what we can do to get round or ameliorate the various issues raised.

On the duty to advise Parliament of the consequences of the Bill, I said at Second Reading that the regulator cannot issue guidance until the European Data Protection Board issues its guidance. That may not be until spring next year. This leaves businesses, charities and parish councils very little time, first, to make representations to Parliament; secondly, to bring in new procedures; and thirdly, to train the staff they will need. In that short time, organisations will all be competing for very skilled staff. That must push the price of those skilled staff up at a time when these small businesses will find it very difficult to pay.

I look forward with interest to hearing what my noble friend says, and I hope that he will be able to agree to the meeting that my noble friend asked for.

--- Later in debate ---
Baroness Chisholm of Owlpen (Con)

My Lords, the Bill creates a comprehensive and modern framework for data protection in the UK. The importance of these data protection standards continues to grow—a point which has not been lost on noble Lords; nor has it been lost on organisations, business groups and others. We are grateful for all the feedback we have received through responses to the Government’s call for views and on our statement of intent, and, most recently, on the drafting of the Bill itself. Hence this large group of technical amendments seeks to polish various provisions of the Bill in response to that feedback. If I may, I will save noble Lords from the tedium of going through each amendment in turn—we would be here all night—and instead focus on the small number of substantive amendments in the group.

I begin with Amendment 51, which ensures that automatic renewal insurance products purchased before 25 May 2018 can continue to function. Automatic renewal products work on the principle that, if the insured person does not respond to the renewal notice, their insurance continues uninterrupted. Without the amendment this would not be possible for products such as motor insurance, which require processing of special categories of personal data and criminal convictions and offences data, potentially leaving individuals unwittingly uninsured.

Amendment 55 responds to a request from the Welsh Government to extend an exemption on passing information about a prisoner to an elected representative to Members of the Welsh Assembly. I am very happy to give effect to that request.

Amendment 56 ensures that existing court reporting—so important for ensuring open justice—can continue. Judgments may include personal data, so this amendment will allow the courts to continue with current reporting practices.

Paragraph 9 of Schedule 2 provides a limited exemption in respect of certain regulatory activities which could otherwise be obstructed by a sufficiently determined individual. Amendment 86 adds five additional regulatory activities to that list to allow relevant existing data processing activities to continue.

Amendment 87 extends the common-sense protection provided by paragraph 22 of Schedule 2 for confidential employment references, so that it also expressly covers confidential references given for voluntary work.

Amendments 90 and 186 ensure a consistent definition of “publish” and “publication” throughout the Bill.

I conclude my brief tour—it did not seem very brief to me—of these amendments with reference to the amendments to Schedule 6. As noble Lords will recall, in creating the applied GDPR, Schedule 6 anglicises its language so as to ensure that it makes sense in a UK context. This is a mechanical process involving, for example, replacing the term “member state” with “United Kingdom”. Amendments 112 to 114, 116 to 118 and 120 to 124 refine that process further.

The remaining amendments that I have failed to mention will dot the “i”s and cross the “t”s, as detailed in the letter from my noble friends Lord Ashton and Lady Williams when the amendments were tabled on 20 October. For these reasons, I beg to move Amendment 8 and ask the House to support the other government amendments in this group.

Lord Knight of Weymouth

My Lords, I will be brief on this group but I have two points to make. One is a question in respect of Amendment 51, where I congratulate the insurance industry on its lobbying. Within proposed new paragraph 15A(1)(b) it says,

“if … the controller has taken reasonable steps to obtain the data subject’s consent”.

Can the Minister clarify, or give some sense of, what “reasonable” means in this context? It would help us to understand whether that means an email, which might go into spam and not be read. Would there be a letter or a phone call to try to obtain consent? What could we as citizens reasonably expect insurance companies to do to get our consent?

Assuming that we do not have a stand part debate on Clause 4, how are the Government getting on with thinking about simplifying the language of the Bill? The noble Baroness, Lady Lane-Fox, is temporarily not in her place, but she made some good points at Second Reading about simplification. Clause 4 is quite confusing to read. It is possible to understand it once you have read it a few times, but subsection (2) says, for example, that,

“the reference to a term’s meaning in the GDPR is to its meaning in the GDPR read with any provision of Chapter 2 which modifies the term’s meaning for the purposes of the GDPR”.

That sort of sentence is quite difficult for most people to understand, and I will be interested to hear of the Government’s progress.

Lord Clement-Jones

My Lords, I thank the noble Baroness for introducing these amendments in not too heavy a style, but this is an opportunity to ask a couple of questions in relation to them. We may have had since 20 October to digest them; nevertheless, that does not make them any more digestible. We will be able to see how they really operate only once they are incorporated into the Bill. Perhaps we might have a look at how they operate on Report.

The Bill is clearly a work in progress, and this is an extraordinary number of amendments even at this stage. It raises the question of whether the Government are still engaged in discussions with outside bodies. Personally, I welcome the fact that there has been dialogue with the insurance industry—a very important industry for us. We obviously have to make sure that the consumer is protected while it carries out an important part of its business. I know that the industry has raised other matters relating to third parties and so on. There have also been matters raised by those in the financial services industry who are keen to ensure that fraud is prevented. Even though they are private organisations, they are also keen to ensure that they are caught under the umbrella of the exemptions in the Bill. Can the noble Baroness tell us a little about what further discussions are taking place? It is important that we make sure that when the Bill finally hits the deck, so to speak, it is right for all the different sectors that will be subject to it.

--- Later in debate ---
I agree that the language can be very complicated and we are certainly working to make it understandable to everyone. We are still talking to stakeholders about issues that they may have. For instance, on the insurance amendment we talked to the ABI and Lloyd’s and worked with them when we drew up the amendment. We will carry on doing that with anybody who wishes to be in touch with us. I think that answers the questions asked by the noble Lord, Lord Clement-Jones. We are certainly still in touch with people.
Lord Knight of Weymouth

To clarify the question around insurance companies, if, as technology and communications change, there is a sense that the insurance companies should work a bit harder, would the first recourse be to go to the Financial Conduct Authority in order for it to regulate the insurance companies to do a better job?

Digital Understanding

Thursday 7th September 2017

Lords Chamber
Lord Knight of Weymouth (Lab)

My Lords, this is another debate on digital led by the noble Baroness, Lady Lane-Fox, and yet another long list of speakers. Her leadership in this area is obvious. It is a pleasure to follow the noble Lord, Lord Baker. There is plenty I want to say in response to his speech, but that will have to wait until next Thursday’s debate in the name of the noble Baroness, Lady Stedman-Scott.

As the past chair and now patron of the Good Things Foundation, there is also much I would like to say relating to the need to narrow the divide in digital skills and understanding between the majority and the more than 10 million Britons without the skills and confidence to take advantage of the digital world. These are most likely to be older, poorer and disabled: the most vulnerable in our society.

I also remind your Lordships of my interests in the register, in particular my work with TES. In the analogue world, this was the Times Educational Supplement, but in its digital incarnation it minimises the number of characters used and is simply TES. That work has hugely helped my understanding of the power of digital to help the recruitment, training and resourcing of teachers.

I have also co-founded a business, xRapid, which uses the ability of a smartphone to recognise patterns through its camera lens, attached to a microscope, to diagnose malaria and count asbestos fibres. These machines are then able to learn from each other and thereby keep increasing the accuracy of the diagnosis.

Of course, these exciting forms of artificial intelligence need fuelling and their precious fuel is data, so that is what I will focus my remarks upon. This House will shortly be considering the data protection Bill. As the noble Baroness said, it is vital that enough of us have sufficient digital understanding to properly scrutinise and improve that legislation. In doing so, we need to pay special attention to those least able to understand and advocate for themselves.

My attention therefore turns to children: there is no demographic that has a greater need for improved digital understanding. Most parents struggle to advise their children on online safety, but they are also highly concerned to know that their child’s personal data are safe. We currently have little time in the school curriculum, which the noble Lord has just described, to teach children about data. We need to fix that, so that children know what information, images and videos are collected that are personal to them, why, by whom and for what use. What plans does DCMS have to engage children on this agenda?

Will the Minister talk to the DfE about this, and include a warning about the national pupil database? The NPD routinely collects highly sensitive data about all the nation’s children and shares them across government departments, with academics and with private companies. There is little transparency as to why it collects what it does, and it is a workload pressure on teachers; I hope that the Minister can help to address concerns about this data collection quickly.

Our digital future is uncertain. With transparency, inclusion and understanding, we can progress with consent and confidence.

Data Ethics Commission

Monday 10th July 2017

Lords Chamber
Lord Ashton of Hyde

The data protection Bill, which will come before Parliament in the autumn, is to give effect to the General Data Protection Regulation and the law enforcement directive. It will obviously include things to do with privacy, but data ethics covers many other things, such as artificial intelligence, which the noble Baroness mentioned. So it is not specifically a regulatory thing, although regulation may come out of it. It is to consider the new issues that come with this new technology.

Lord Knight of Weymouth (Lab)

My Lords, artificial intelligence has the potential to significantly empower us as humans, but comes with the worries that have been expressed. The noble Baroness, Lady Harding, mentioned parents. What plans do the Government have to engage children in this discussion about their data and their rights to the privacy of that data?

Lord Ashton of Hyde

That is of course important, and the data protection Bill will include measures to protect children and to allow data which is held by social media companies, for example, to be deleted. As for engaging children in considering these ethical issues, that is something that the data commission can consider but, as I said, we have not yet been specific about the structure, function and remit of the commission.