Digital Economy Bill (Second sitting) Debate
Q For this session we have until 2.45 pm. Will the witnesses please introduce themselves for the record?
David Austin: My name is David Austin. I am the chief executive of the British Board of Film Classification.
Alan Wardle: I am Alan Wardle, head of policy and public affairs at the National Society for the Prevention of Cruelty to Children.
Q David, am I right in interpreting the amendments that the Government tabled last night as meaning that you are intended to be the age verification regulator?
David Austin: That is correct. We reached heads of agreement with the Government last week to take on stages 1 to 3 of the regulation.
Q Are you sufficiently resourced to take on that role?
David Austin: We will be, yes. We have plenty of time to gear up, and we will have sufficient resource.
Q Will it involve a levy on the porn industry?
David Austin: It will involve the Government paying us the money to do the job on our usual not-for-profit basis.
Q What risks do you envisage in people handing over their personal data to the pornographic industry?
David Austin: Privacy is one of the most important things to get right in relation to this regime. As a regulator, we are not interested in identity at all. The only thing that we are interested in is age, and the only thing that a porn website should be interested in is age. The simple question that should be returned to the pornographic website or app is, “Is this person 18 or over?” The answer should be either yes or no. No other personal details are necessary.
We should bear in mind that this is not a new system. Age verification already exists, and we have experience of it in our work with the mobile network operators, where it works quite effectively—you can age verify your mobile phone, for example. It is also worth bearing in mind that an entire industry is developing around improving age verification. Research conducted by a UK adult company in relation to age verification on their online content shows that the public is becoming much more accepting of age verification.
Back in July 2015, for example, this company found that more than 50% of users were deterred when they were asked to age verify. As of September, so just a few weeks ago, that figure had gone down to 2.3%. It is established technology, it is getting better and people are getting used to it, but you are absolutely right that privacy is paramount.
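To make the yes/no exchange Mr Austin describes more concrete, the sketch below shows an age check that returns only a boolean to the requesting site. It is a minimal illustration: the record layout, function names and response format are assumptions, not any real provider's interface.

```python
# A minimal sketch of the yes/no age check described above. The names, the
# data held by the verifier and the response format are illustrative
# assumptions, not any real provider's API.
from dataclasses import dataclass
from datetime import date

@dataclass
class VerifierRecord:
    # Held only by the age-verification provider (for example, a mobile
    # operator); never passed to the requesting website or app.
    opaque_token: str
    date_of_birth: date

def is_over_18(dob: date, today: date) -> bool:
    # Compare years, adjusting if this year's birthday has not yet occurred.
    had_birthday = (today.month, today.day) >= (dob.month, dob.day)
    return today.year - dob.year - (0 if had_birthday else 1) >= 18

def answer_age_check(record: VerifierRecord) -> dict:
    # The only thing the requesting site receives: a single yes/no answer.
    return {"over_18": is_over_18(record.date_of_birth, date.today())}

if __name__ == "__main__":
    record = VerifierRecord(opaque_token="example-token",
                            date_of_birth=date(1990, 5, 1))
    print(answer_age_check(record))  # {'over_18': True}
```

The point of the design is that identity details stay with the verifier; only the derived attribute crosses the boundary.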
Q Are you suggesting that it will literally just be a question—“Is the user aged 18?”—and their ticking a box to say yes or no? How else could you disaggregate identity from age verification?
David Austin: There are a number of third-party organisations. I have experience with mobile phones. When you take out a mobile phone contract, the adult filters are automatically turned on and the BBFC’s role is to regulate what content goes in front of or behind the adult filters. If you want to access adult content—and it is not just pornography; it could be depictions of self-harm or the promotion of other things that are inappropriate for children—you can go to your operator, such as EE, O2 or Vodafone, with proof that you are 18 or over. It is then on the record that that phone is age verified. That phone can then be used in other contexts to access content.
Q But how can that be disaggregated from identity? That person’s personal data is associated with that phone and is still going to be part of the contract.
David Austin: It is known by the mobile network operator, but beyond that it does not need to be known at all.
Q And is that the only form of age verification that you have so far looked into?
David Austin: The only form of age verification that we, as the BBFC, have experience of is age verification on mobile phones, but there are other methods and there are new methods coming on line. The Digital Policy Alliance, which I believe had a meeting here yesterday to demonstrate new types of age verification, is working on a number of initiatives.
Q May I say what great comfort it is to know that the BBFC will be involved in the regulatory role? It suggests that this will move in the right direction. We all feel very strongly that the Bill is a brilliant step in the right direction: things that were considered inconceivable four or five years ago can now be debated and legislated for.
The fundamental question for me comes down to enforcement. We know that it is difficult to enforce anything against offshore content providers; that is why in the original campaign we went for internet service providers that were British companies, for whom enforcement could work. What reassurance can you give us that enforcement, if you have the role of enforcement, could be carried out against foreign entities? Would it not be more appropriate to have a mandatory take-down regime if we found that a company was breaking British law by not asking for age verification, as defined in the Bill?
David Austin: The BBFC's heads of agreement with the Government do not cover enforcement. We made clear that we would not be prepared to enforce the legislation in clauses 20 and 21 as they currently stand. Our role is focused much more on notification; we think we can use the notification process and get some quite significant results.
We would notify any commercially-operated pornographic website or app if we found them acting in contravention of the law and ask them to comply. We believe that some will and some, probably, will not, so as a second backstop we would then be able to contact and notify payment providers and ancillary service providers and request that they withdraw services from those pornographic websites. So it is a two-tier process.
We have indications from some major players in the adult industry that they want to comply—PornHub, for instance, is on record on the BBC News as having said that it is prepared to comply. But you are quite right that there will still be gaps in the regime, I imagine, after we have been through the notification process, no matter how much we can achieve that way, so the power to fine is essentially the only real power the regulator will have, whoever the regulator is for stage 4.
For UK-based websites and apps, that is fine, but it would be extremely challenging for any UK regulator to pursue foreign-based websites or apps through a foreign jurisdiction to uphold a UK law. So we suggested, in our submission of evidence to the consultation back in the spring, that ISP blocking ought to be part of the regulator’s arsenal. We think that that would be effective.
Q Yes. Given that this is a big, complicated problem that is hard to tackle, I was just trying to get a feel for how much of it you think we can fix, with your expertise and the Bill.
David Austin: We can fix a great deal of the problem. We cannot fix everything. The Bill is not a panacea but it can achieve a great deal, and we believe we can achieve a great deal working as the regulator for stages 1 to 3.
Q My question follows on neatly from that. While I am sure that the regulation will tackle those top 50 sites, it obviously comes nowhere near tackling the problems that Mr Wardle outlined, and the crimes, such as grooming, that can flow from those problems. There was a lot of discussion on Second Reading about peer-to-peer and social media sites that you have called “ancillary”. No regulation in the world is going to stop that. Surely, the most important way to tackle that is compulsory sex education at school.
Alan Wardle: Yes. In terms of online safety, a whole range of things are needed and a whole lot of players. This will help the problem. We agree with, and want to work with, the BBFC on a proportionality test that identifies where the biggest risks to children are, and on keeping that test up to date. That is not the only solution, though.
Yes, we believe that statutory personal, social and health education and sexual relationships education is an important part of that. Giving parents the skills and understanding of how to keep their children safe is also really important. But there is a role for industry. Any time I have a conversation with an MP or parliamentarian about this and they have a child in their lives—whether their own, or nieces or nephews—we quickly come to the point that it is a bit of a nightmare. They say, “We try our best to keep our children safe but there is so much, we don’t know who they are speaking to” and all the rest of it.
How do we ensure that when children are online they are as safe as they are offline? Of course, things happen in the real world as well, and no solution is going to be perfect. But in terms of content, we would not let a seven-year-old walk into the multiplex and say, “Here is ‘Finding Nemo’ over here and here is hard-core porn—off you go.”
In the same way, we need to build those protections in online, so that we know what children are seeing and to whom they are speaking, while also skilling up children themselves through school and helping parents. But we believe the industry has an important part to play, alongside Government, in regulating and ensuring that the spaces where children are online are as safe as they can be.
Q To follow on from the Minister’s question, you feel you are able to tackle roughly the top 50 most visited sites. Is there a danger that you then replace those with the next top 50 that are perhaps less regulated and less co-operative? How might we deal with that particular problem, if it exists?
David Austin: When I said “the top 50”, I was talking in terms of the statistics showing that 70% of people go to the top 50. We would start with the top 50 and work our way through those, but we would not stop there. We would look to get new data every quarter, for example. As you say, sites will come in and out of popularity. We will keep up to date and focus on those most popular sites for children.
We would also create something that we have, again, done with the mobile operators. We would create an ability for members of the public—a parent, for example—to contact us about a particular website if that is concerning them. If an organisation such as the NSPCC is getting information about a particular website or app that is causing problems in terms of under-age access, we would take a look at that as well. In creating this proportionality test what we must not do is be as explicit as to say that we will look only at the top 50.
First, that is not what we would do. Secondly, we do not want anyone to think, “Okay, we don’t need to worry about the regulator because we are not on their radar screen.” It is very important to keep up to date with which sites are the most popular—and therefore where regulation of under-age access is most effective—and to deal with complaints from members of the public and from organisations such as the NSPCC.
Alan Wardle: I think that is why the enforcement part is so important as well, so that people know that if they do not put these mechanisms in place there will be fines and enforcement notices, the flow of money will be stopped and, crucially, there is that backstop power to block if they do not operate as we think they should in this country. The enforcement mechanisms are really important to ensure that the BBFC can do their job properly and people are not just slipping from one place to the next.
Q We have roughly 45 minutes for this group of witnesses, if necessary. Will the witnesses please introduce themselves?
Dr Whitley: My name is Dr Edgar Whitley. I am an academic at the London School of Economics. Of particular importance for this session is the fact that I am the co-chair of the privacy and consumer advisory group of the Government Digital Service.
Scott Coates: Good afternoon. My name is Scott Coates and I am the CEO of the Wireless Infrastructure Group, an independent British wireless infrastructure company that builds and operates communication towers and fibre networks.
Q In your written evidence, Mr Coates, you talked about the need for greater diversity in the ownership of mobile infrastructure. Does the Bill go far enough on that?
Scott Coates: We welcome the measures in the Bill to improve the speed at which infrastructure can be deployed and to improve the economics of deploying the infrastructure. It is critical to understand that there are different ways of deploying infrastructure. There are different ownership models, for which the Bill could have different impacts. When I say “infrastructure”, I mean the kind of mobile and fixed infrastructure that you see in the field, whether that is cables, ducts, cabinets or communication tower facilities.
There are two different types of owners of those types of infrastructure. First, the vertically integrated players are effectively building and operating that infrastructure for their own networks, primarily, and their business case is based on their economic use of that infrastructure. Secondly, you have a growing pool of independent infrastructure companies, of which we are one. We are very different from the traditional, vertically integrated players in that we are investing in infrastructure not for our own network, but to provide access, on a shared basis, to all other networks.
Q What are the current proportions for ownership?
Scott Coates: If I talk about mobile infrastructure, around a third of the UK’s communication towers—of which we think there are around 27,000—are independently operated. It is really interesting that, globally, there has been a very firm shift over the past decade towards more independent operation of such upstream digital infrastructure.
Currently, more than 60% of all communication towers globally are held in an entity separate from the networks that use them. In countries such as India or the US, that figure is somewhere between 80% and 90%. There are real benefits that flow from the independent ownership of infrastructure. We are trying to do more in the UK, but the UK currently lags behind in the global statistics I mentioned.
Q Does the Bill do anything to address that?
Scott Coates: One of the things that we acknowledge and welcome in the Bill is that it is very clear about maintaining investment incentives—not just for the vertically integrated players, but for the independent infrastructure players such as ourselves—
Q It will not do anything to address the proportion, will it? It will only entrench the division already there.
Scott Coates: I do not think that the Bill does anything to encourage more independent infrastructure. The Government’s policy position at the moment is very clear: they want to maintain investment incentives for independent infrastructure. Achieving that clarity requires the Bill to be worded very carefully.
When we deploy our tower facilities and infrastructure on or adjacent to land, the definition of land, as things stand, often covers things that sit on that land. One of the potential risks is that if the activities we engage in and the facilities that we deploy are not carefully carved out, they risk being treated as land. Under the new valuation principles in the communications code, that potentially gives them no value or low value, which would obviously be devastating to investment appetite. The consequence would be further concentration of infrastructure ownership in the hands of the larger, vertically integrated players, who have different incentives from us when they approach this.
Q So there is potential for this to get worse, but what could be done to actually encourage more independently owned infrastructure?
Scott Coates: We would like to see a carve-out that is as clear as possible for the activities that we are engaged in. We would like to see it made absolutely clear that the communications code, which is a compulsory purchase tool to bring land into the telecoms sector, does not drift beyond that focus and risk entering into what is really Ofcom’s territory, which is to govern the relationships between telecoms companies.
Q Dr Whitley, if I may jump to part 5 of the Bill, we heard earlier that there were concerns that the Government have not taken sufficiently into account safeguards around privacy and personal data. Do you think that this strikes the right balance between open policy-making and privacy?
Dr Whitley: My main concern with part 5 is that the detail is just not there. The codes of practice that one would expect to have there, which would give the details about how privacy might be protected, are not present. We have been involved with the privacy and consumer advisory group. As far as I can tell, we had our first meeting with the team who were developing these proposals back in July 2013. We said from the very beginning that we want detail, because when we have specific details we can give advice and suggestions and review it, but we have never had that level of specific detail.
Q So the proposals do not reflect at all the three years of consultation that have taken place?
Dr Whitley: Obviously, that is reflected in some parts of the proposals, but we asked for more details specifically on how privacy will be protected regarding the data-sharing proposals, and that is still not there.
Q Should that detail be in primary legislation?
Dr Whitley: Whether it is in primary legislation or in codes of practice, my personal view is that you need a certain level of detail to be able to make an informed decision. Otherwise there will be some vague position of, “We will share some data with other people within Government. Trust us, because we are going to develop some codes of practice that will be consulted on and will then be put in front of Parliament. There will be protections and it will all be fine”. We are saying that there are lots of different ways of doing that. The earlier you give us at least a first attempt at those details, the better we can improve it.
Q In that period of consultation, was the detail around transparency never discussed?
Dr Whitley: It depends. There has been talk along the lines of there being codes of practice and liaison with the Information Commissioner’s Office, so at a very high level there has obviously been some discussion. But consider the very specific level: the civil registration clauses talk, on the one hand, about allowing a yes/no check on whether there is a birth certificate associated with a family, while on the other hand there will be bulk data sharing within Government so that different Departments can know things and possibly make things better for society.
One half of that seems to be quite specific, and you can see how it could well be designed as a simple “Does a birth certificate exist for this person?” and the answer is yes or no. The privacy protections around that are reasonably well known and not very much data is being shared. Then the other illustration just says, “we will share these data with other bits of Government” and there is nothing there about what kind of privacy protections might be put in place. There are many different ways in which that can be done, but until we have some specific details, we cannot give you sensible reviews as to whether that is a good or not so good way of doing it.
Q Mr Coates, what role should wireless technologies play in achieving the universal service obligation?
Scott Coates: There is no doubt that for the last 5%—maybe a greater proportion than that—wireless technologies have a significant role to play. Six of the seven trials run by the Department for Culture, Media and Sport earlier this year were based on wireless solutions. I think there is a role for it. It is also interesting, as you look beyond 10 megabits to a future in which universal service means something far more substantial than that, that a new disruptive technology is coming.
Everyone is talking about 5G; it does not really exist at this stage, but we know it is going to be ultra-high bandwidth, ultra-low latency, with the potential to be a disruptive technology and replace fixed line to the home. Some countries around the world that have not had the wave of fixed line technology roll-out will be moving straight to wireless as their domestic broadband service.
Q Once we work that out, which I am confident we will, where are the opportunities? Where is the up side? Where is the positive stuff coming out of this? How can Government be better as a result of this? I am always an optimist.
Dr Whitley: Done right, there are fantastic opportunities. Government is digitising. The GDS has a lot of experience of how to manage, handle and do attribute checking, which is what most of this is. There are definitely opportunities, and the skills are there, but somehow something has gone wrong with these proposals.
It is not as if the proposals have been rushed through in the past few minutes. We have been looking at them and asking for more details since July 2013, and we are still here without even a semblance of a code of practice. Part 5 has six codes of practice that need to be developed and none of them is here. Yes, please—but give us some detail. I am an academic; I want to see the detail.
Q As you say, it is an enormous shift in terms of data sharing within Government. Clause 29 would allow personal data on citizens to be shared if there is a
“contribution made by them to society”
or wellbeing to be gained. That basically covers anything, doesn’t it? Why have the Government not produced even a draft code of practice at this stage? How can we possibly be expected to vote on this while plainly placing blind faith in the Government?
Dr Whitley: You are basically saying what I was going to say. If you compare the comprehensive replies that Mr Coates has been able to give, talking about very specific details, with the vague “we don’t know anything” comments that I have made, you see that it is a real problem and also an issue for more general scrutiny of technological issues. If you do not have details about the different mobile phone frequencies that you are talking about, you cannot make detailed policy. Yet when it comes to data sharing, there is a sense that it will all work out in the end because we have the right people to do it.
Q How would you advise the Government to achieve that code of practice?
Dr Whitley: We have consistently said—the Privacy and Consumer Advisory Group particularly, because we have this existing relationship with Government, but civil society and experts more generally—that we are more than happy to engage. We have repeatedly said, “Give us some detail. Don’t just come and talk about high-level stuff. Give us the detail and we will give you detailed comments to improve the process.”
That has worked very well in relation to the Verify scheme; that is privacy friendly and has a lot of support from the kinds of people who are very concerned about privacy. So the expertise is there and the working relationships are there. Give us an opportunity to help; we want to. It is just that we need something to work on.
Thank you very much to Mr Coates and Dr Whitley for some excellent evidence. We are very grateful. We will now move on to our next set of witnesses.
Examination of Witnesses
Jim Killock and Renate Samson gave evidence.
Q I will pick up where we left off, if that is okay. You were both involved in the consultation process for part 5 of the Bill. Did the proposals come as a surprise to you? Do they make sense to you as data experts?
Renate Samson: No, they do not make very much sense, if I am honest. As I said, we were a member of the open policy making process and we also submitted to the consultation. I am genuinely surprised that after a two-year process, all of a sudden it felt very rushed. There were conversations and meetings happening right up to the Queen’s Speech; there was still a general lack of clarity, particularly on safeguards, and many questions were still being asked, such as how, why, when and so on. The next thing we knew, it was in the Queen’s Speech and the Bill was published.
Reading through part 5—and I have read through it a lot and scratched my head a great deal, mainly for the reasons given in evidence earlier today—you see that the codes of practice, which would explain an awful lot of what we imagine is meant or may not be meant, just have not been published. I have repeatedly asked for them and been given various expected dates, and we are sitting here today without them but with the Bill already having been laid before Parliament.
We have also done a lot of work on the Investigatory Powers Bill, for which the codes of practice were there right from the start. There was clarity as to what was intended and what was going to be legislated for, straight up. So, I am profoundly disappointed, because data sharing and digital government are hugely important and we seem to be very far away after a very long process.
Jim Killock: It is worth considering why the open policy making process was put in place. Data sharing is known to be potentially controversial. It was knocked out of at least one previous Bill a few years back when proposed by Labour because of the lack of privacy safeguards. Everyone understood that something more solid was needed. Then the Cabinet Office was very keen to ensure it did not raise hackles, that it got the privacy and the safeguards right, that trust was in place. It was therefore a surprise, after that intense process, to get something back that lacked the safeguards everybody had been saying were needed.
We are particularly concerned not only about the lack of codes of practice, but also about the fact that a lot of these things should be in the Bill itself. Codes of practice are going to develop over years. We need to know about things like sunsetting, for instance—that these arrangements are brought to a close, so that you do not just have zombie data-sharing arrangements in place that everyone has half-forgotten about and that are then suddenly revived. You need to have Parliament involved in the specifics.
As we have heard, data sharing has a huge range of possibilities, starting with the benign and the relatively uncontroversial: statistics and understanding what is happening to society and Government policy, where privacy is relatively easy to protect. You use the data once, you do the research and that is it. It ranges from that through to the very intrusive: profiling families for particular policy goals might be legitimate, but it also might be highly discriminatory. Getting to the specifics is important.
You need the safeguards in place to say, “These are the kinds of things we will be bringing back; these are the purposes that we may or may not share data for.” That way, you know there is a process in place. At the moment, it feels like once this has passed, the gate is opened and it is not necessarily for Parliament to scrutinise further.
Q We talked earlier about the bulk transfer and bulk sharing of data, and an earlier witness talked about providing data access, rather than data sharing. Should the Government not be pursuing trials on that basis, rather than these enormous powers without any kind of assurances to the public or parliamentarians about how they will be using them?
Renate Samson: Putting bulk to one side for a moment, at the end of the open policy making process it was very specific that, for the fraud and debt aspects of the Bill, three-year pilot projects would take place, with subsequent review and scrutiny, potentially by the OPM or by another group. Instead, they are in the Bill as legislation, with the Minister deciding whether or not it is okay and potentially asking other groups, which are not defined. That is half an answer to half your question. Pilots are an excellent idea if they are pilots, not immediate legislation.
With regard to the bulk powers in the Bill, civil registration documents were a late addition, and we are still not clear as to their purpose. The purpose given in the consultation to the OPM process, and in the background documents relating to the Bill, is a whole mix of different reasons, none of which, I would argue, is clear and compelling or, indeed, necessary and proportionate. But again, as you have heard a lot today, without detail, how can we properly answer your question?
Jim Killock: I have a quick observation on this. We currently have a data protection framework. The European Union is revising its data protection laws; they are somewhat tougher, which is quite a good thing, but we do not know what the future of data protection legislation is in the UK. It might be the same or it might be entirely different in a few years’ time.
That is a very good reason for ensuring that privacy safeguards are quite specific and quite high in some of these sensitive areas, because we do not know whether the more general rules can be relied on and whether they are going to be the same. That is not to say that we do not need higher safeguards in any case here, because you are not dealing with a consent regime. People have to use Government and Government have to look at the data, so it is not a mutual agreement between people; you have to have higher safeguards around that.
Q My questions are directed at Mr Killock and relate to paragraphs 37 and 38 of your submission, “Definition of pornographic material”. We heard earlier that both the NSPCC and the British Board of Film Classification support a provision to require ISPs to block websites that are non-compliant. There was also discussion of widening the scope to apply the restrictions to other harmful material that we would not allow children access to in the offline world. Here, you seem to be questioning the value of that:
“This extension of the definition…also raises questions as to why violent—but not sexual—materials rated as 18 should then be accessible online.”
I also question this consistency but the solution, to me, seems to be that we should include other material, such as violent material and pro-anorexic websites, as we talked about earlier. Will you tell us a bit more about what your objection is to creating a framework to keep children as safe online as they are offline?
Jim Killock: We have no objection; it is a laudable aim and something we should all be trying to do. The question is, what is effective and what will work and not impinge on people’s general rights? As soon as you look a little beyond pornography, you are talking about much more clear speech issues.
There will be a need to look at any given website and make a judgment about whether it should or should not be legally accessed by various people. That starts needing things like legal processes to be valid. Some of the things you are talking about are things that might not be viewed by anybody, potentially. The problem with all these systems is that they just do not work like that. They are working on bulk numbers of websites, potentially tens of thousands, all automatically identified, as a general rule, when people are trying to restrict this information. That poses a lot of problems.
I also query what the measure of success is here, because I suspect that the number of teenagers accessing pornography will probably not be greatly affected by these measures. There is more of an argument that small numbers of children who are, perhaps, under 12 may be less likely to stumble on pornographic material, but I doubt that the number of teenage boys, for instance, accessing pornographic material will be materially changed. If that is the case, what is the measure of success here? What harm is really being reduced? I just feel that, probably, these are rather expensive and difficult policies which are likely to have impacts on adults. People are saying it is not likely to affect them, but I rather suspect it might, and for what gain?
Examination of Witnesses
Paul Nowak, Sarah Gold and Chris Taggart gave evidence.
Q Paul, the Government have delayed by a year outlining their digital strategy. Could you give the Ministers a hand here? What would you like to see in a digital industrial strategy?
Paul Nowak: There are a number of points in the Bill where we think there are positive steps forward: things like the universal service obligation. I am happy to talk about some of those points. The missed opportunity for us is really getting a handle on what the emerging digital economy means for working people. Tomorrow, we will have the outcome of the court decision on Uber. That is just one example of where changing technology potentially affects working people’s lives. We believe there should be a proper framework and employment law should properly reflect the change in the world of work. The point was made by a number of MPs on Second Reading that the Bill missed a trick in terms of that new framework of rights and responsibilities for people who work.
Q What would that framework look like?
Paul Nowak: It would tackle issues around, for example, employment status. We have this curious interface between the new, emerging digital economy and what I would characterise as some old-fashioned, exploitative employment practices. It is great that we can all order new goods and services online via eBay, but often the person who delivers that package will be working to an app and will be so-called self-employed, driving their own vehicle and with no rights to paid holidays, maternity or paternity leave and so on.
So we need a framework of laws that is fit for the digital age. It is welcome that the Government have announced that Matthew Taylor will be looking at some of these issues, but I would have thought that, for a Digital Economy Bill, that is a gap in the Bill itself.
Q Has the TUC been consulted on that by the Government?
Paul Nowak: We have had no engagement in terms of the process I described with Matthew Taylor and, as far as I am aware, we have had no input in terms of the Bill and the thinking around what a decent framework of employment rights will look like to respond to that emerging digital economy.
Q What about the digital skills gap—where could the Bill go further there?
Paul Nowak: That is not something that we have looked at particularly, but I think it goes without saying that the need for digital skills will go well beyond those core digital industries. The proof of the pudding will be in the eating. We are pleased that the Government are now talking about industrial strategy, and we think that the digital economy should play a key role at the heart of that industrial strategy. It is not just about digital industries themselves; it is about how those digital industries can support jobs in our manufacturing, engineering and creative industries, but you need to make sure that people have the skills—not just at one moment in time, but ongoing skills throughout their working lives—to enable them to adapt to the changing world of work. For example, one of the things that we have pushed heavily through our Unionlearn arm is equipping people with those skills, but making the case that people should have access to careers advice and guidance all the way through their working lives rather than just at the point at which they leave school, college or university.
Q Sarah and Chris, I do not know whether you were here for the earlier sessions, but we have heard quite a few concerns about the data-sharing proposals in part 5 of the Bill. Do you share the concerns about the lack of privacy safeguards in those proposals?
Sarah Gold: I do. There are quite a few pieces of information missing that I would like to see in the Bill to protect individuals’ privacy. I think I heard Jeni Tennison talk earlier about openness and transparency, and I agree with her that one of the major pieces that is missing from the Bill is transparency about how people’s information will be used.
For me, this is also a missed opportunity to talk about consent, which is increasingly becoming a design issue, not necessarily just one of policy. That means making sure that there are steps in place to ensure that people understand how their data will be used, by whom, for how long and for what purpose. That is really important, because currently, the only models of consent we seem to default to are terms and conditions, and I have to ask the Committee: when was the last time any of you read or understood a set of terms and conditions?
Q Claire Perry brought up the poor standards in the private sector earlier. Presumably you agree that the Bill misses an opportunity to deal with consent for the private sector’s use of data as well.
Sarah Gold: It does, because I think the Government should set best standards on this. There is a real opportunity to do that, and I cannot see that on the face of the Bill.
Chris Taggart: I broadly agree. There was a comment in one of the submissions that despite this being a Digital Economy Bill, it felt like it was from almost 10 years ago. We have the ability to treat data in a much more granular way—dealing with permissions, rights and so on; having things selectively anonymised; having things almost time-boxed, and so on. It struck me that it felt like the Bill was using the broad brush of how we used to exchange data 10 years ago. That seemed like a missed opportunity, particularly given that what we are talking about here is Government to Government. While it is very difficult for the private sector—or even between the Government and the private sector—to come up with some of those solutions, when you are talking essentially about one organisation, particularly one where there is the ability to legislate that everything should happen in the right way, it seems to be a missed opportunity.
I was asked a couple of years ago to be on the Tax Transparency Sector Board, which talked about opening up some of the tax data. Of course, pretty much no data were actually opened up, but some of the discussions were interesting. For example, the Bill talks a lot about individuals, which is absolutely right—I believe that we have innate human rights—but from a tax point of view, individuals and companies are exactly the same thing. There is no difference. HMRC was saying, “Hey, look, whatever we think and whatever we would like to do, we have no ability to treat individuals and companies as the same.” The idea of allowing companies to tick a box and say, “Yes, we’d like our tax to be reported and to be open about it,” or saying, “These offenders will be treated differently if they are corporate offenders,” for example—many countries do report tax offences by companies—was not even possible because of the underlying legislation. There is a sense that that sort of attitude slightly pervades some of this. Again, I am extremely in favour of the Government being more effective and efficient and using information sharing for that, but I would like the Bill to be as good as it possibly can be.
Finally, there are little things—I used to be a journalist but now I am a full-time geek—such as what is being reported? What things have been shared? How are those organisations being identified? The Government do not even have a coherent way of identifying Government Departments or non-departmental public bodies. Those sorts of things. There is a lot more that could be done to make this a genuinely effective Bill.
Q Mr Taggart, you mentioned something about its feeling as though it is 10 years out of date. I want to bring us bang up to date by chucking in a Brexit question. Is there anything that the three of you could very quickly add to the discussion about what might need to be in the Bill, given that Brexit is now happening? Brexit has implications for the digital economy, about which I am sure you know more than I do.
Chris Taggart: I will try to be brief. One is to do with policy aspects of what happens. I believe you are hearing from the Information Commissioner later. What happens to data protection in a post-EU UK? From our perspective, the UK has generally taken a slightly different perspective on data protection from the information commissioners in some other countries and is generally taking things like public interest into account and treating paid-for and free information the same, which we welcome. We have some concerns about the general data protection regulations because of that sort of stuff and some of the stuff that is coming from the EU. There are some potential benefits, but there are also some downsides about whether people’s rights will be defended. I think the digital economy becomes much, much more important, and my position here is as an advocate of open data and the potential for open data in driving a thriving digital economy. As a digital entrepreneur, I think we are missing some significant opportunities for that. If you were to sit down today and do a digital economy Bill with the knowledge that in a couple of years we perhaps would not be part of the EU, I think we would be doing something quite different.
Paul Nowak: May I pick up the point about post-Brexit? I think there is growing political consensus that one of the implications of the decision on 23 June is that we need to think seriously about how we invest in our national infrastructure. For the TUC that goes beyond Heathrow, Hinkley, High Speed Rail. It talks to issues around, for example, high-speed broadband. It is about thinking about how this Bill would interface with, for example, announcements that might come in the autumn statement about investment in high-speed broadband. I note that the Chair of the Committee talked about the interface between rail and high-speed broadband, which is something that should be borne in mind. Again, valid points were made on Second Reading about requirements for developers to incorporate high-speed broadband into new housing developments, which is absolutely essential. I reiterate the point I made earlier about seeing this in the context of the wider approach to industrial strategy and how the digital economy can support other parts of the economy that are going to be even more important as we move forward post-Brexit.
Sarah Gold: For me, particularly looking at privacy, security and personal data, it is about the age of some of the language used in the Bill. Even talking about data sharing feels to me like the wrong language. We should be talking about data access. Data sharing suggests duplication of databases, with data being slopped around different Departments, whereas data access suggests accessing minimum data via APIs or by using the canonical Government registers, which is an excellent project that is not mentioned in the Bill but should be.
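As a rough illustration of the distinction Ms Gold draws between sharing and access, the sketch below contrasts copying a whole register with asking it one narrow question. The register contents and function names are hypothetical, not the Bill's wording or any real Government API.

```python
# An illustrative contrast between bulk "data sharing" and narrow "data
# access", in the spirit of the distinction drawn above. The register, its
# fields and the function names are hypothetical.

CANONICAL_REGISTER = {
    "opaque-id-001": {"name": "A. Person", "address": "1 Example Street", "born": "1990-01-01"},
    "opaque-id-002": {"name": "B. Person", "address": "2 Example Street", "born": "2010-06-15"},
}

def share_whole_dataset() -> dict:
    # "Data sharing": the receiving Department takes a duplicate copy of every
    # record and every field, whether or not it needs them.
    return {record_id: dict(fields) for record_id, fields in CANONICAL_REGISTER.items()}

def access_single_attribute(record_id: str, attribute: str):
    # "Data access": the caller asks a narrow question of the canonical
    # register and receives only the field it needs, with no copy of the rest.
    record = CANONICAL_REGISTER.get(record_id)
    return None if record is None else record.get(attribute)

if __name__ == "__main__":
    print(len(share_whole_dataset()))                         # duplicates every record
    print(access_single_attribute("opaque-id-002", "born"))   # returns one field only
```

The design point is data minimisation: the answering service holds the canonical record, and callers receive only the minimum needed for their question.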
Q Thinking about algorithms beyond the workplace, we know that Uber, for example, will charge more if your battery is low. Having worked for an insurer before I was elected, I know that the amount of data that is available to insurers to set prices would make your hair curl. How much transparency should there be around the algorithms that companies use to set prices, while protecting the intellectual property of those algorithms?
Chris Taggart: That is a fantastic question, and it goes to the heart of our ability to understand our world and influence it. I take a quite strong, almost democratic, first-principles view of this: you need to be able to understand the world, and then to be able to influence it. That is what democracy is about. If we do not understand the world—if we do not understand that we are being given this particular news story in this particular way; that we are being given this particular price; that we are being influenced to walk down this street rather than that street—then we really do not have that possibility. A question that is not asked often enough, but that is starting to be asked more in academic circles, is: what are the algorithms on which our lives depend? If we do not understand that we are being driven by algorithms, still less what those algorithms are, how do we have agency? How do we have free will, if you like? I think it is a really important question.
I think that increasingly we will see that we need transparency around that, and that with transparency there is always the possibility of downsides. You could argue that, by having courts open, people can just walk in off the street and see that this person over there—some neighbour, or whoever—is being prosecuted. But if we do not start to ask those sorts of questions and start to come up with some informed answers, we will be in a world where we have lost the ability to ask them.
Paul Nowak: I am not particularly well versed in this area, but I suppose that it is a little bit like the terms and conditions question. You could provide so much transparency that it would give the illusion of people being informed, and I think what you want to do is to allow people to understand what are the potential implications of those algorithms. So, if you are using Uber you know that if there is a spike in demand or a lack of supply, you are likely to pay more, and what the implications of that might be, and what the parameters of that are. I do not think that means that Uber needs to make all of its software open source—frankly, that would mean nothing to me—but I want to know when I get in what the fair contractual exchange is between me and the company that is providing the service.
Sarah Gold: I am very well versed in this area but I have very little time to talk about it, which is very frustrating. However, I think that looking at how individuals can question algorithms is very important; I agree with both of your comments. In the GDPR in particular, there is a clear provision about people being able to question automated decisions that are made about them.
As a design problem, that is really fascinating. For instance, think about buying flights in a browser: everyone has probably seen that when you go back to book the flight again, your IP address has been tracked and there is a cookie, so the same flight now costs you more. So you go into a kind of incognito mode to check that.
What I am quite interested in at the moment is that sort of incognito testing of algorithms, so that you can see how your inputs might change an output. In the context of Uber and insurance, I am very interested in the emergence of insurance for, say, a single day of driving or for a particular route—where, say, it costs you far more to be insured to go down the M1 than the A1. You should be able to understand why that decision has been made about you, because it has a significant consequence for your life.
However, that comes down to the quality of the training data, too, and that comes back to some of the terms of the Bill. We should be working towards greater data minimisation, I think, and also towards people being able not only to audit those data and correct them when they are wrong, but to have an audit of data access. While it may not mean everything to all of us, because not all of us are developers, I think that for those individuals who are able to scrutinise the code and check for digital rights management or security vulnerabilities, or for biases in data sets, that information is really crucial, because those individuals are our greatest defence against data misuse or fraud.
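The incognito testing Ms Gold describes can be pictured as querying the same pricing algorithm twice, once with identifying state and once without, and comparing the quotes. The sketch below uses an invented pricing rule purely for illustration; it is not how any real service sets prices.

```python
# A toy version of the incognito test described above: query the same
# pricing algorithm with and without identifying state and compare the two
# quotes. The pricing rule is invented purely for illustration.

BASE_FARE = 100.0

def quote_price(route: str, recognised_repeat_visitor: bool) -> float:
    # Hypothetical rule: a visitor recognised from a tracking cookie is quoted 15% more.
    price = BASE_FARE * (1.15 if recognised_repeat_visitor else 1.0)
    return round(price, 2)

def incognito_test(route: str) -> dict:
    with_tracking = quote_price(route, recognised_repeat_visitor=True)
    incognito = quote_price(route, recognised_repeat_visitor=False)
    return {
        "route": route,
        "with_tracking": with_tracking,
        "incognito": incognito,
        "difference": round(with_tracking - incognito, 2),
    }

if __name__ == "__main__":
    print(incognito_test("LHR-EDI"))
    # {'route': 'LHR-EDI', 'with_tracking': 115.0, 'incognito': 100.0, 'difference': 15.0}
```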
Thank you very much indeed; that is a high note on which to conclude. I thank our three witnesses for their evidence. We are very grateful. We will now release you and call our final two witnesses for the afternoon to come forward.
Examination of Witnesses
Professor Sir Charles Bean and Hetan Shah gave evidence.
Welcome to our two final witnesses today; I am sure you will keep us on our toes in our final session. Could you please introduce yourselves for the record?
Hetan Shah: I am Hetan Shah, Executive Director of the Royal Statistical Society.
Professor Sir Charles Bean: Charlie Bean, London School of Economics and soon to be Office for Budget Responsibility.
Q We have heard from witnesses today about a lot of the negatives and potential pitfalls of data sharing across Government. I have nothing against the Government’s intentions here, but do you share the concerns of previous witnesses about the lack of safeguards for privacy in part 5 of the Bill?
Professor Sir Charles Bean: You will have to excuse me; since I was not here for your earlier discussions, I am obviously not aware of what earlier witnesses have said and what their reservations are. My interest obviously is in the use of the information for statistical purposes. It is important that there is a clear and well understood framework that governs that, and there clearly need to be limitations around it.
I have to say that I think the current version of the Bill strikes a reasonably sensible balance, but there are bits that will clearly need to be filled in. The Office for National Statistics will need to spell out a set of principles that govern the way it will access administrative data, and so forth.
Q Do you think there is any framework in part 5 around the sharing of data?
Professor Sir Charles Bean: Sorry—
You said you are satisfied that it strikes the right balance. Do you believe there is any framework in terms of the principles for data sharing in part 5?
Professor Sir Charles Bean: By “appropriate balance”, I mean in terms of the statistical authority having in-principle access to the administrative data that it needs to do its work, subject to certain limitations.
Q Do you believe there should be transparency for—
Professor Sir Charles Bean: I certainly believe in transparency. I am a big fan of transparency. Anyone who has worked at the Bank of England would like transparency.
Hetan Shah: May I come in and build on this? Privacy is absolutely critical to maintaining public trust, and in a sense we think the Bill has missed a trick here. On the research side, the framework is embedded on the face of the Bill. In our view, the ONS has a very good track record—it has maintained 200 years of census data, it has the best transparency, it publishes all usage of the data and misuse of the data is already a criminal offence—but that has not been put on the face of the Bill. A tremendous amount could be done to reassure by taking what is already good practice and putting it on the face of the Bill, and I think that would answer the issue for the statistics and research purposes.
Q My full question was not, “Do you believe in transparency?” It was going to be: do you believe in transparency in terms of how citizens’ data will be shared with the Government and between Government agencies? That principle, as you say, is not only not on the face of the Bill but not anywhere in the Bill. We have been asked by the Government to rely on codes of practice that have not even been drafted yet.
Professor Sir Charles Bean: I agree that transparency about the principles that will govern sharing of information makes a lot of sense.
Q As you say, Mr Shah, Government data sharing requires public trust if it is to work, and digital government and the use of your statistics absolutely require trust that the Government will handle data with due purpose and cause.
Hetan Shah: Another thing is that the UK Statistics Authority is directly accountable to Parliament, not the Government. That actually makes the statistics and research strand more accountable than other parts of the Bill. It is worth reminding you of that, because it is very important.
Q I would be interested if you could explain and put on the record some of the consequences you see of having this Bill and the underlying secondary legislation on the statute book. What impact will that have on the areas in which you are experts?
Professor Sir Charles Bean: The key thing is that it greatly improves the gateways that enable the Office for National Statistics to use administrative data—tax data and the like—in the construction of official economic statistics. We are well off the pace compared with many other countries. The Scandinavian countries, Canada, the Irish and the Dutch place very heavy reliance on administrative data and only use surveys to fill in the gaps. Here, the Office for National Statistics is essentially an organisation that turns the handle, sending out 1.5 million paper forms a year and processing them. In effect, you are acquiring again the same information that you already have in some other part of the public sector, where it is being collected for other purposes.
The key gains here I see as twofold. First, because you access something close to the universe of the population rather than just a sample, as would normally be the case with a survey, you potentially get more accurate information. It is potentially also more timely, which is important for economic policy purposes.
The other side of the coin is that by enabling you to cut back on the number of surveys you do, there is a cost gain—which, I should say, would probably not mainly be a gain to the ONS, because it still has to process the administrative data, but a gain to the businesses and households who are currently spending time filling in forms that they would not need to if more use were made of administrative data.
Q Mr Shah, you keep mentioning access to data, but the problem we heard earlier is that the Bill talks not about access to data but about data sharing, which implies duplication. We should really be moving towards data minimisation. Do you think that the language of the Bill should reflect access to data, rather than data sharing?
Hetan Shah: My view is that for the clauses on statistics and research the Bill is pretty clear that it is about data access.
Q It discusses the transfer of data. It does not talk about your accessing data. It does not mention the technology through which you would do it. There are no codes of practice alongside how it would happen. It is very broad and explicitly talks about data sharing in certain areas.
Hetan Shah: I think I said this earlier, but in case I was not clear I shall repeat it. For statistical and research purposes, statisticians and researchers are interested only in aggregates; they are not interested in us as individuals. It is a key point that the relevant clauses are quite different from some of the other parts of the Bill. Others have indicated in their evidence that this area should be seen as slightly different.
It is also worth noting that there are safeguards that have been tried and tested over many years. There is the security surrounding the data—the ONS will not even let me into the vault where it holds the data. You need to be accredited and to sign something saying that you will not misuse the data; if you do, you will go to jail. The trick that has been missed is not saying all that, because it is almost assumed that that is how the ONS works. My suggestion is that if you want to strengthen that part of the Bill, you should simply lay out the safeguards that are already common practice at the ONS.
Q Thank you both for setting out some very factual and helpful arguments as to why the provisions are a good thing, particularly when it comes to aggregate statistics. I was struck by a quote in your report published in March, Professor Sir Charles. You mentioned the
“cumbersome nature of the present legal framework”,
which the Bill will clearly help to solve, and you also said that there was a
“cultural reluctance on the part of some departments and officials to data sharing”
and, in many ways, to working together, as we know from experience. How do we solve that problem and get Departments to realise how helpful some of these datasets might be?
Professor Sir Charles Bean: A key thing about the Bill is that it shifts the onus of presumption. There is a presumption of access unless there is a good reason not to comply or explain, if you like, as opposed to the current arrangement, which is that the data owner has the data and you say, “Can you please let us have a look at it?” There is civil service caution. I was a civil servant very early on in my career, so I am aware of how civil servants think. Inevitably, you are always worried about something going wrong or being misused or whatever. That plays into this, as well.
In the review I said there are really three elements and I think they are mutually reinforcing. There is the current legal framework, which is not as conducive as it could be; there is this innate caution on the part of some civil service Departments, or even perhaps on the part of their Ministers on occasion; and then the ONS has not been as pushy as it might have been. It is partly that if you know it is very difficult to get in—people are not very co-operative at the other end and the legal frameworks are very cumbersome—you are less inclined to put the effort in, and you think, “Oh, well, let’s just use the surveys, as we’ve always done.” So I think you need to act on the three things together, but they are potentially mutually reinforcing if you get the change right.
Hetan Shah: This is one area where I think the Bill could be strengthened. At the moment, the ONS has the right to request data; similarly, the researchers have the right to request data. The Department can still say, “No”, and in a sense the only comeback is that there is a sort of name-and-shame element of, “Parliament will note this”, as it were. My worry, given the cultural problems that have been seen in the past, is that that may not be enough. So why do we not do what Canada does? It just says, “The ONS requests”, and the Department gives.