(1 year, 10 months ago)
Lords Chamber
My Lords, they say that there is no such thing as a free lunch. When it comes to the social media companies, that is certainly true. Google Search is free, as are Facebook, Twitter, Instagram, WhatsApp, YouTube, TikTok and a host of other online services. All of them are great products, hugely popular and used by billions of people every day throughout the world. That raises the question: why are they free? It is because the mass of data that the internet companies hoover up on their billions of users is a treasure trove. They collect data such as location, shopping, searches, medical records, employment, hobbies and opinions. It is said that Google alone has more than 7,000 data points on each one of us. In our innocence, we all thought that we were searching Google; little did we realise that Google was searching us.
What do they do with this hoard of data? They synthesise it through algorithms and sell the results to advertisers. Traditionally, advertisers spent huge amounts on newspapers, television and other media, struggling to target their markets; it was imprecise. Today, using the data provided by the social media companies, advertisers can personalise their message and pinpoint it accurately. It is hugely cost-effective and it generates hundreds of billions in revenue. Data truly has become the new oil.
Of the five largest companies in the world by market value, four are big tech: Apple, Microsoft, Alphabet/Google and Amazon. Indeed, Apple alone has a market value equal to the combined value of all the companies on the FTSE 100 Index. Big tech is bigger than most countries. The big tech companies are richer than us, they move faster than we do, they are aggressive, they are litigious, they are accountable to no-one, they have enormous power, and they make their own rules. They employ the smartest people in the world, even including a previous Deputy Prime Minister of our country.
The Zuckerberg shilling can buy a lot of influence. Let us take a look at Facebook. Its platform has allowed the most unspeakable acts of violence, hate and perversion to go viral, pretty much unchecked. It says that it moderates content, but it is not enough, and usually too late. Now we learn that Mr Zuckerberg is spending $10 billion a year on developing his metaverse. Already we have read of examples of virtual reality sex orgies, and participation in gruesome violence, all viewed through a Meta headset, where the avatars are quasi-people and it becomes almost impossible to distinguish reality from fiction. Imagine where that is all going. Frances Haugen, the Facebook whistleblower, had it right when she said that only profit motivates the company.
This is a landmark Bill. We have to get it right, and we have to make it tough, with no room for loopholes or ambiguities. I have tried to paint a picture of the participants. I have worked and been involved in the digital industry for over 50 years. I know the nature of the beast. They will fight to the last to preserve their business model. Do not underestimate them. These people are not our friends.
(2 years, 9 months ago)
Lords Chamber
My Lords, like most noble Lords, I am absolutely thrilled to see Lord Puttnam here today on the steps of the Throne. His wisdom and fingerprints are all over this cracking report and we owe him a great debt of thanks. Speaking very personally, I have to say that it is a real tragedy that he is no longer a Member of this House. My noble friend Lord Lipsey is to be congratulated on stepping into his shoes and delivering such a masterful introduction to this debate.
Never one to hold back, Lord Puttnam said in another speech that the Government’s response was “lamentable” and:
“It came across as if written by a robot”.
I will go a little further. On this committee sat Members of your Lordships’ House, drawn from all sides, each of whom has extensive experience of the dangers to our democracy from the misuse of digital technology—and we have heard from this debate just how powerful and experienced all the contributors are. So why did DCMS produce such a tepid and bland reply to our report, and why did it not accept many of our 45 recommendations? We know the answer: it just ignored them. Written by a robot? More likely written by a junior with the brief, “Write 25 pages and say nothing”.
I wish to contain my comments to recommendation 8:
“The Competition and Markets Authority should conduct a full market investigation into online platforms’ control over digital advertising.”
I will link that to what I believe to be the massive dangers to our democracy posed by big tech, in particular Google and Facebook; I do not have to use their other names, Alphabet and Meta.
When Google was founded, it had a corporate mantra which proclaimed, “Don’t be evil”. Facebook had one too. It was “Move fast and break things”. Today, Google handles 3.5 billion searches every day; Facebook has 1.7 billion active users. These staggering figures show that both services are hugely popular in much of the world, and they are also free of charge. But, of course, we all know that they are not free, because their product is each one of us, and our combined data is very valuable. The amount of data that Google has on all of us is mind-blowing: 10,000 petabytes, a number so immense that I cannot even conceive of it. How does Google collect it? From our location, from our searches, from the apps we use and from what we buy and where we buy it. Facebook has 300 petabytes of information as well: a smaller number but still huge.
Both companies monetise this data by using algorithms that produce results that are vital to advertisers in selling their products. If data is the new oil, Google and Facebook and their sister companies, WhatsApp, YouTube and Instagram, are swimming in it. The ownership of such data gives these companies enormous power—corporate power the likes of which has never been seen before. But they have not behaved like responsible citizens. In the political environment, we have seen massive abuses of power, particularly by Facebook. The role of Cambridge Analytica in the 2016 Brexit referendum, and then its role in the 2016 presidential election in the United States, are famous examples. The data it provided, and the manner in which it obtained it, were contrary to the best aspects of democracy. Frances Haugen, the Facebook whistleblower, said of Facebook in her brave testimony to the House of Commons Select Committee:
“Unquestionably, it is making hate worse.”
The power of these companies is awesome. Their bank accounts are huge. They are staffed by brilliant people and they hire the best advice in the world, Mr Clegg included. Plus they pay minuscule tax on their enormous global profits. From an economic point of view, both Facebook and Google are monopolies—and not just national monopolies but global ones. Google, for example, has 92% of the UK search market; YouTube’s figures are even higher. These companies engage in surveillance capitalism. They are dangerous and we need to curb their power.
Luckily, movement is afoot. In the United States, Lina Khan, head of the FTC, is pushing for tougher antitrust enforcement. In the US Senate, Amy Klobuchar has introduced an antitrust Bill. In the EU, Margrethe Vestager is introducing the Digital Services Act to regulate big technology. But we lag behind. I ask the Minister whether the Government have any plans to encourage the CMA to investigate the monopoly powers and influence of big tech. Big tech moves fast and breaks things. Big tech also facilitates hate and evil. We, the lawmakers, are ponderous and slow to act, but the threat to our democracy is real and we need to move with haste.
(6 years, 7 months ago)
Lords Chamber
At end insert “, and do propose Amendment 53B instead of the words left out of the Bill by this Amendment and by Amendment 207”.
My Lords, the words in the Bill and the words on the screens above us summarise my position. This is the Data Protection Bill, and my amendment is solely about protecting data—our data; data of national significance; and, in particular, the data owned by our National Health Service. Who do I wish to protect it from? From the predatory big tech companies, which see a huge financial opportunity in developing this NHS data and creating data algorithms; they can then sell those for billions of pounds, leaving us with precious little in return. The very same companies, by the way, pay minuscule corporation tax in our country and, indeed, in their own countries. They are clever, immensely well funded and very focused—they run rings around the NHS.
I feel that I have to prevent this happening. I seek to set in motion a process which will keep the value of this data for the benefit of our NHS, so that it can use the proceeds either to plug its growing budget deficit or to fund critical medical research—or, indeed, both. If I let my imagination go even further, I would like to see the setting up of a sovereign health fund into which these proceeds could be channelled and administered, in the same way as the Norwegians set up a sovereign wealth fund. What they have done with the proceeds of their North Sea oil, we can now do with our data bonanza. As many have said throughout the Bill’s proceedings, data is the new oil—and we have struck a gusher.
If I may be permitted to extend the analogy even further: like oil in the ground, this data is crude; it needs to be refined. Huge investment will be needed to create a data refinery capable of synthesising the millions of records that will produce the algorithms. It should be seen as a national co-production, perhaps through a public-private partnership.
At Second Reading, I stated that it was my judgment that the market value of NHS longitudinal data could be worth billions of pounds. In all honesty, as I progressed, I fully expected someone to disagree with me and tell me that I was wrong. But no such person has come forward. All the experts seem to confirm my position. I made the point that the longitudinal data owned by the NHS was unique, with tens of millions of patient records going back to 1948 and even earlier. No other country has access to such a treasure trove. Even better, our population is diverse, with the records of people whose family members come from all corners of the globe. We have a perfect dataset.
The reason big tech companies are so interested in this data is that with the combination of sophisticated software, ultra-fast data processing, artificial intelligence and machine learning capabilities, they are able to produce algorithms which are tremendously powerful. These can be used to predict organ abnormalities to the extent that clinicians can save time and money, and ultimately people’s lives. And who can disagree with that? It is wonderful for all mankind.
By way of an example, DeepMind, which is based in London—it is a subsidiary of Alphabet, which owns Google—has been working with the Royal Free on anticipating acute kidney injury. Like a knight on a white charger, DeepMind has financed the digitisation of millions of patients’ data and produced algorithms that are already making a major contribution to the diagnosis of difficult-to-diagnose conditions. It has cost the Royal Free next to nothing and, unsurprisingly, its staff are over the moon. What they do not realise is that the algorithms produced by DeepMind have international value and will be monetised all over the world for the benefit of Google, not of our NHS.
DeepMind and companies like it are swarming all over the NHS. For my part, to put it bluntly, I want to stop them gathering the benefits of our data on the cheap. My new amendment would water down the previous amendments that your Lordships agreed to on Report—amendments that the Commons in its infinite wisdom decided to annul. Frankly, I am still at a loss to understand why a Conservative Government would not want to maximise this goldmine; I always thought they were the party of business.
I have, however, taken on board the points made by the Information Commissioner. She said that the previous amendments went beyond her powers, so I have reduced them to a minimum. In substitution, I have inserted a requirement for the Secretary of State to require the National Audit Office to prepare a code of practice for data controllers, giving guidance on how to obtain best value from the commercial exploitation of personal data of national significance, and for the NAO to report annually to Parliament on the commercial exploitation of that same data.
The Minister and his team have listened to what I have had to say and I am very grateful for his kindness and attentiveness. Our last meeting was very helpful, and I look forward to him confirming the points that were made. I beg to move.
My Lords, I support Amendment 53A, moved by the noble Lord, Lord Mitchell. In doing so, I wish to make two specific points that follow on from his speech today. First, the amendment crucially recognises the importance of measuring what we as a nation are doing with data of significance before we take important, industrially strategic decisions on how we make the most of this vital national resource.
The noble Lord and others have made the analogy of data as the new oil. That analogy works particularly well for personal data as, like oil, it is potentially as toxic as it is valuable, and it must be carefully handled and not allowed to be released into the environment without due care. If we are to best manage, protect and distil it, we must first learn where and how it is being moved, used and commercialised. Can we as a nation easily answer the question that we are asking of Facebook or the former Cambridge Analytica: how much data are we commercialising at home and abroad, and to whom? If not, why not? Progressive young and emerging nations are reviewing how they use their national data for national advantage, and we must make a concerted effort to do the same.
My second point is that the amendment therefore recognises that this measurement should be done centrally, rather than burdening already stretched government departments with developing their own approaches. While these departments must remain involved to provide domain insight into certain data types—for example, health and social care—the National Audit Office or other bodies should take charge of a cross-departmental process for measuring and tracking these flows of significant and valuable data. In this way we should be able to develop a consistent, coherent view of how we are handling our data reserves, which will give us the best possible evidence upon which to base our decisions on a secure approach to maximising their impact for our future national good. I therefore hope the Minister will be able to shed some light today on how this process is being thought through.
I will find out what was said. We should deal with what the GDPR calls special categories of data very sensitively. We should take data on health, sexual orientation, ethnicity and things like that very seriously. That is what the GDPR does and we will continue to do it under the Bill.
Finally, I return to the Commons amendments. I am afraid we still cannot support Amendments 53A and 53B as, at the moment, we believe that they are fundamentally the wrong solution. However, I hope that the productive discussions, to which the noble Lord, Lord Mitchell, referred, along with what I have said today, have convinced the noble Lord that our vision is aligned and that he finds sufficient reassurance in these words, and the written assurances that he has had from my noble friend Lord O’Shaughnessy, to be able to withdraw his amendment.
I thank the noble Lord for his very helpful comments. I also thank my noble friend Lord Freyberg, who has been with me all the way on this and given me huge support, and the noble Baroness, Lady Jones, for her comments. On the Front Benches, the noble Lord, Lord Clement-Jones, has always been a supporter and, at this particular point, the noble Lord, Lord Stevenson, has guided me through the intricacies of ping-pong, which I was not aware of.
I have heard what the Minister has said, and have received a letter from the noble Lord, Lord O’Shaughnessy. It is the end of the football season. We are now in extra time; we are still at a draw and could be facing penalty shoot-outs, but I am going to decline that. I beg leave to withdraw the amendment.
(6 years, 11 months ago)
Lords Chamber
My Lords, I too thank my noble friend Lady Kidron for introducing this timely and important debate. However, I feel such a hypocrite. I live on many screens: I tweet and I follow, and I delight in Amazon—no more trips to the shops. As for Google, where would I be without it? The only major social media platform I do not use is Facebook—there I draw the line. Truth be told, I love the products that these companies provide, and yet I am so critical of these very same companies. Yesterday in your Lordships’ House I spoke about protecting public data assets from big tech. Today, I want to speak about big tech’s lack of corporate and social responsibility.
The fact is that these companies—Apple, Amazon, Google, Twitter and Facebook—have become the colossuses of our 21st-century world. They stand astride our economies and our social interchanges. Their corporate power is, quite frankly, scary, and the influence of their products on society can be devastating—just look at the recent US elections. It was Google that coined the phrase, “Don’t be evil”, a mantra that could just as well be applied to any of the other participants. They believe that they exist for the benefit of mankind. My view is different. I am truly worried by the power these companies wield, but before I turn to social media, I should like to address the related issue of their universal obsession with avoiding tax.
Why do I talk about tax avoidance in a social media debate? Because big tech companies are driven to avoid paying tax with the same fervour as they are driven to avoid taking responsibility for the content that appears on their platforms. The same organisations that employ the brightest people in the world to design their products, enhance their systems and create their algorithms also employ the cleverest people on this planet to ensure that they pay little or no tax. Billions, perhaps even trillions, of dollars of untaxed corporate profits are squirrelled away in Luxembourg, the Cayman Islands and the like—the result of convoluted international structures set up with one purpose only. Maybe their mantra should be modified: “Don’t be evil, pay no taxes”.
This afternoon’s debate centres on the responsibilities of the social media companies, whether such companies are platforms or publishers, and how they should be regulated. For too long, YouTube, Twitter and Facebook have positioned themselves as platforms—conduits of data—with no responsibility for their content. These days, few believe that. We continue to be shocked by the vile words and images that the world is able to access on these platforms. No newspaper, however extreme it may be, would ever dare publish the lies and images that we see on social media, but even now little is done to control these companies.
I am certain of one thing: if the social media companies and big tech really wanted to clean up their act, they could do it. The genius that created these amazing organisations, and the accumulation of talent and resources now at their disposal, are unparalleled in history. All of this could be harnessed to clean out their stables. They could become good corporate citizens. All they need is the will, or perhaps the legislative imperative, for without laws, it is clear that anything goes.
(6 years, 11 months ago)
Lords Chamber
My Lords, I will also speak to Amendment 108. The points I am addressing were glossed over in Committee, and I now wish to expand on this important issue.
Data is the new oil. This has been said many times in your Lordships’ House, but as each day passes it becomes more true. Without stretching the analogy too far, in our country big data is about to become the 21st-century equivalent of North Sea oil. Because big data has such value, it will come as no surprise to see big tech companies swarming all over it. They have to because it is their lifeline. Many of our public bodies, particularly the NHS, are custodians of massive amounts of data, which big tech is eager to get its hands on. But we as legislators who act for the public good also have a responsibility to ensure that the public are protected and that, simply put, our treasure is not taken from us without clear authority or appropriate recompense. The data the public bodies hold belongs to us all. It is ours—our communal property—and we must tread carefully.
I will make one point as strongly as I can. I am a product of the data revolution; I have been professionally involved in the digital industry for over 50 years, and for 40 of those years I was a serial IT entrepreneur. This industry has been good to me; I fully understand that the tech sector needs light regulation. I know that at its best the digital revolution is a force for good but, equally, I know the dangers it poses, so I am trying to be cautious in what I propose. We stand at a crossroads. Computing power has reached astronomical capabilities, software is increasingly complex and artificial intelligence is now making dramatic inroads. On top of that, we see the exponential availability of digital data. All these have contributed to the creation and brilliance of algorithms. The one thing we know for certain is that these exciting developments will keep on growing at exponential rates. In medicine, for example, new tools are being developed that are already enhancing diagnostic and treatment capabilities and could benefit all manner of healthcare, in particular our ageing population.
I welcome these developments, many of which have come from our own private sector, as I am sure we all do, and we should rejoice at this example of British expertise. However, at the same time we need to strike a balance between the ambitions of 21st-century businesses and the responsibility of government to steward assets and resources of national significance, so that the proceeds of technological developments benefit us all. My two amendments seek to codify how valuable, publicly controlled personal data is shared with big tech companies, and to ensure that financial returns, combined with wider social, economic and environmental benefits, are optimised.
I can best demonstrate the scale of this issue if I refer to the NHS. Ever since its formation in 1948—and perhaps even before that—the NHS has kept records of tens of millions of patients, literally from cradle to grave. These records are either in written form or, increasingly, in digital format, and the magnitude of the collected data is huge. Very few countries can match the length and depth of the health records that the NHS is trusted to retain on behalf of the general public. Such data is called longitudinal data and, when it is bundled together, has great commercial value.
At Second Reading I gave the example of a company called DeepMind, which is a British subsidiary of Google. I visited DeepMind, which is an impressive organisation based here in London. It has purchased access to millions of anonymised data records from institutions such as the Royal Free and Moorfields Eye Hospital. It does not buy this data outright—it does not have to. It simply buys access. Such access enables it and companies like it to use very powerful computers and very sophisticated software to process millions of records with the help of artificial intelligence and machine learning.
This synthesising of data using AI capabilities is designed to produce algorithms, and it is these algorithms that become the product that companies such as DeepMind are able to monetise. They do this by selling the algorithms and their consulting services to the likes of pharmaceutical companies and healthcare providers and even back to the NHS itself. It is a global business and very profitable. At the Royal Free, these algorithms are being used to detect the early onset of kidney disease. At Moorfields Eye Hospital, also here in London, spectacular advances have occurred in similarly detecting potential optical problems.
This is data processing used for the benefit and enhancement of all mankind and we should welcome it. However, I am concerned that this precious and unique data is being offered to big tech companies by our public bodies in the absence of clear and consistent guidelines and without asking how best to obtain value for money in the broadest sense of the term.
Having dealt with big tech companies for most of my life, I know that they are staffed with exceptionally clever people and are no slouches at driving hard bargains. Unlike our NHS, they are not consumed with the day-to-day preoccupation of trying to balance their current budgets; with hundreds of billions of dollars in the bank, they can afford to play the long game, and it is easy to see who holds the aces in any negotiation. Put simply, I wish to protect our public bodies and ensure that we do not give away our inheritance. That is why we need to codify how we will obtain value for money from the sharing of data of national significance with the private sector.
My proposal is not just for the NHS and it is not just for now. All public bodies need protection and guidelines today and well into the future. That is why I have introduced my amendments. In Amendment 107B I seek, first, to require the Information Commissioner to maintain a register of publicly controlled personal data of national significance and, secondly, to prepare a code of practice containing practical guidance in relation to personal data of national significance. These are defined in subsection (2). In Amendment 108 I have set out the requirements of the code on personal data of national significance.
My Lords, I want briefly to express sympathy with the noble Lord, Lord Mitchell. I share many of his concerns but essentially I think that we should look on the most optimistic side. I hope that he is also really describing the opportunities that can be made available with this kind of data, provided that it is accessible in the way described. I know that the noble Lord takes considerable inspiration from Future Care Capital’s report, Intelligent Sharing: unleashing the potential of health and care data in the UK to transform outcomes. I thought that it was very good and well considered.
The noble Lord has put down a very important marker today, but my one caveat is that I am not sure that there is yet a settled view about how to deal with this kind of data. In Committee we talked about data trusts. In her AI review, Dame Wendy Hall also talked about data trusts. I know that we need to head in a direction that gives us much more assurance about the use of the data in the way that the noble Lord, Lord Mitchell, has described, but I am not sure that we have yet reached a consensus on these issues, or come to a decision that this is the best possible model.
My Lords, I am grateful to the noble Lord, Lord Mitchell, for taking the time to come and see me to explain these amendments. We had an interesting conversation and I learned a lot—although clearly I did not convince him that they should not be put forward. I am grateful also to the noble Lords, Lord Clement-Jones and Lord Stevenson, who said, I think, that there may be more work to do on this—I agree—and that possibly this is not the right time to discuss these issues because they are broader than the amendment. Notwithstanding that, I completely understand the issues that the noble Lord, Lord Mitchell, has raised, and they are certainly worth thinking about.
These amendments seek to ensure that public authorities—for example, the NHS—are, with the help of the Information Commissioner, fully cognisant of the value of the data that they hold when entering into appropriate data-sharing agreements with third parties. Amendment 107B would also require the Information Commissioner to keep a register of this data of “national significance”. I can see the concerns of the noble Lord, Lord Mitchell. It would seem right that when public authorities are sharing data with third parties, those agreements are entered into with a full understanding of the value of that data. We all agree that we do not want the public sector disadvantaged, but I am not sure that the public sector is being disadvantaged. Before any amendment could be agreed, we would need to establish that there really was a problem.
Opening up public data improves transparency, builds trust and fosters innovation. Making data easily available means that it will be easier for people to make decisions and suggestions about government policies based on detailed information. There are many examples of public transport and mapping apps, powered by open data, that make people’s lives easier. The innovation that this fosters builds world-beating technologies and skills that form the cornerstone of the tech sector in the UK. While protecting the value in our data is important, it cannot be done with a blunt tool, as we need equally to continue our efforts to open up and make best use of government-held data.
In respect of health data, efforts are afoot to find this balance. For example, Sir John Bell proposed in the Life Sciences: Industrial Strategy, published in August last year, that a working group be established to explore a new health technology assessment and commercial framework that would capture the value in algorithms generated using NHS data. This type of body would be more suitable to explore these questions than a code of practice issued by the Information Commissioner, as the noble Lord proposes.
I agree that it is absolutely right that public sector bodies should be aware of the value of the data that they hold. However, value can be extracted in many ways, not solely through monetary means. For example, sharing health data with companies who analyse that data may lead to a deeper understanding of diseases and potentially even to new cures—that is true value. The Information Commissioner could not advise on this.
That sharing, of course, raises ethical issues as well as financial ones, and we will debate later the future role and status of the new Centre for Data Ethics and Innovation, as the noble Lord, Lord Stevenson, mentioned. This body is under development and I am sure that this House would want to contribute to its development, not least the noble Lord, Lord Clement-Jones, and his Select Committee on Artificial Intelligence.
For those reasons, I am not sure that a code is the right answer. Having heard some of the factors that need to be considered, I hope the noble Lord will not press his amendment.
Perhaps I may offer some further reassurance. If in the future it emerged that a code was the right solution, the Bill allows, at Clause 124, for the Secretary of State to require the Information Commissioner to prepare appropriate codes. If it proves better that the Government should provide guidance, the Secretary of State could offer his own code.
There are technical questions about the wording of the noble Lord’s amendment. I will not go into them at the moment because the issues of principle are more important. However, for the reasons I have given that the code may not be the correct thing at the moment, I invite him to withdraw his amendment.
My Lords, I thank all noble Lords for their contributions to this short debate. I also thank the Minister for agreeing to see me prior to the Recess and for his comments today. However, this is an issue of precision—and we need precision on the statute book. All that has been suggested to me, which is that it can be found elsewhere or will be looked at in the future, does not give the definitive answer we require. That is why I would like to test the opinion of the House.