Lords Chamber
That this House takes note of the Report from the Communications Committee Regulating in a digital world (2nd Report, HL Paper 299).
My Lords, I have the privilege to introduce this debate on the report of the Communications Committee. I do so as chairman of that committee. I am most grateful to the staff of our committee for their assistance in preparing the report: Theo Pembroke, the clerk; Niall Stewart and Theo Demolder, the policy analysts; and Rita Cohen, the committee assistant. They have turned their excellent minds to a whole range of complex issues and have given the committee first-rate advice. I also thank Professor Andrew Murray of the LSE, who provided in-depth and expert advice throughout this inquiry. Of course I thank the members of the committee, who have brought great expertise, experience and insight to this study of a complex and vitally important area of public policy. I declare an interest as a freelance consultant to Finsbury, a PR company, and I am an electoral commissioner.
The internet has enabled people and organisations to communicate, participate in society and democracy, and to transact business on a scale which would have been unimaginable only a couple of decades ago. However, regulation has not kept pace with the nature and scope of the digital services which now affect the lives that we live. A large volume of activity occurs online which would not be tolerated offline, including abuse and hateful speech. A handful of very large tech companies have come to dominate the environments in which they operate, buying up potential competitors. Self-regulation by online platforms is inconsistent, unaccountable and inadequate, so there is a compelling and urgent case for further regulation.
What is needed is not just more regulation but a new approach to regulation. More than a dozen UK regulators have a remit covering the digital world, but no single body has complete oversight. Regulation of the digital environment is fragmented, with gaps and overlaps. Problems are neglected until they become emergencies. Policymakers offer knee-jerk responses to media stories which may have unintended consequences. One of our witnesses described this as “regulation by outrage” and compared it to whack-a-mole. Regulation needs to be better co-ordinated, more consistent and in line with the public interest.
In our report we set out proposals to ensure that rights are protected online as they are offline while keeping the internet open to innovation and creativity, with a new culture of ethical behaviour embedded in the design of services. UK regulators have a world-class reputation and help to make the UK an attractive place for business. Tech companies should work with regulators to build well-considered, stable regulation which leads to consistent and predictable outcomes. There is a great opportunity for the UK to benefit from the soft power that comes with the international reputation of its regulators and for tech companies to be part of a programme of thoughtful, measured reform.
We had two key recommendations which shaped our report. First, we recommended the creation of a new digital authority to co-ordinate regulators and to identify and address gaps in regulation. Its board would consist of the chief executives of the relevant regulators with independent non-executives. It would work, crucially, with Government, Parliament and civil society to draw up priorities and work across its component bodies. It would continually assess the regulatory landscape and from time to time make recommendations on what new regulatory powers were needed.
The authority would also play a vital role in providing the public, the Government and Parliament with the latest information on technological developments. Under our vision, there would be an important role for Parliament in monitoring progress and responding where regulatory gaps are identified by granting new powers as necessary. To this end, we proposed a new Joint Committee of Parliament with a remit to consider all matters related to the digital world. This would enable Parliament to maintain democratic scrutiny over the regulators. The work of this Joint Committee would be informed by the digital authority, which would regularly report to it.
In their response to our report, the Government state that they aim to provide co-ordination and oversight through their digital charter programme. They note several initiatives to strengthen the regulation of digital technology, including the work of expert reviews. However, our concern is that implementing the recommendations of each of these separate pieces of work could further fragment the regulatory landscape. Many reviews and reports have recommended new regulators. However, we believe it is time for co-ordination, not proliferation, of regulators. That is why we propose the digital authority as a forward-looking, horizon-scanning body that consolidates and supplements what is already there. We think the horizon-scanning role is vital, enabling us to get ahead of technology changes that will affect our society and to design, with the industry, public policy solutions to address emerging risks.
For those who worry about the impact of regulation on innovation and freedom of expression I argue that, by anticipating the future impact of technological development, regulation is likely to be more proportionate and considered. With much greater co-operation from industry in the process, credible solutions at the design stage are also more viable. That is why we see the digital authority as a UK centre of expertise that can support our regulators, Government and Parliament and attract talent that is so often poached by the big tech companies.
Our second key recommendation was that all online regulation should be underpinned by 10 principles, including accountability, transparency, respect for privacy and freedom of expression. These principles would help the industry, regulators, the Government and users work towards a common goal of making the internet a better, more respectful environment that is beneficial to all. Responding to our report, the Government said that these were aligned with the principles set out in their online harms White Paper. However, we argued for principles-based regulation that is flexible and seeks to ensure appropriate outcomes. This is necessary in the fast-changing world of the internet. Our principles are not just an aspiration for what regulation should look like; they are intended to inform both the development of policy and the implementation of regulation. The principles cannot be taken separately; policymakers and regulators must consider them together, carefully balancing competing factors such as regulation and innovation, online safety and freedom of expression.
The Government have done considerable work to address online harms through their White Paper. I welcome efforts to introduce robust regulation, but our concern is that they appear to be doing this in isolation from other work. To begin, we noted that questions of design are at the heart of how the internet is experienced. Design affects how users behave online and how decisions are made about them. The architecture of many online services is designed to capture users’ attention so that their data—essential to the business models of most of the large tech companies—can be extracted. Personal data are processed using black-box algorithms that are not transparent. Extraction is not limited to data that users upload; behavioural data are gleaned from users’ online activity. We recommend that users should have greater control over the collection of personal data; maximum privacy and safety settings should be the default.
The Government noted that the general data protection regulation addresses a number of these points, but this law is new and its application untested. We identified grey areas that the Government should clarify, such as inferred data. We also suggested ways to increase transparency and accountability in line with our principles. For example, we recommend that data controllers and data processors should be required to publish annual data transparency statements detailing which forms of behavioural data they generate or purchase from third parties, how they are stored, for how long, and how they are used and transferred. This is quite different from the privacy statements that currently exist.
The internet presents challenges to competition law. Digital markets develop quickly, whereas the competition regulator relies on meticulous and ex post analysis. There is widespread concern that competition authorities place too much emphasis on price. Meanwhile, the digital economy is characterised by the concentration of market power in a small number of companies that operate online intermediary platforms. These platforms benefit from network effects to gain dominant positions in their respective markets. Some of them provide services to consumers without charge. As intermediaries in markets they can shift costs from consumers to suppliers, while both are dependent on them. Information gained from direct access to consumers also gives platforms a competitive edge, and they hold vast data sets. There is concern that they use this information to identify and buy up emerging competitors. The Government should consider creating a public interest test for data-driven mergers and acquisitions. To deliver this, the digital authority could help co-ordinate the work of the Competition and Markets Authority and the Information Commissioner’s Office, both of which gave us thoughtful evidence.
Regulation should also recognise the inherent power of intermediaries. Greater use of data portability might help, but this would require more interoperability. I welcome the review by Professor Jason Furman that explored these issues and made recommendations. The noble Lord, Lord Tyrie, has called for the Competition and Markets Authority to have greater powers to regulate in the interests of consumers; I look forward to his contribution to this debate. Technology companies provide venues for illegal content and other forms of online abuse, bullying and fake news. Although they acknowledge some responsibility, their responses are not of the right scale to deal with the problem.
The Government’s proposal to introduce a duty of care accords with our recommendation. However, we did not wish to recommend a new regulator to enforce this duty. We recommend that, at least initially, Ofcom should be responsible for enforcing the duty of care. In so doing it should focus on the process for dealing with online harms rather than on content or on specific instances of wrongdoing. Big platforms should invest in better moderation processes. They should be held to the standard they set out in their own terms of service. We also recommend that online platforms should make community standards clearer through a new classification framework akin to that of the British Board of Film Classification. I would be grateful if the Minister could respond to that recommendation. I also ask him for assurances about the impact of the duty of care on the press and how the Government intend to balance journalistic freedom with regulation of online harms.
Looking at the speakers’ list, I know we will enjoy a thoughtful and fascinating debate this evening and look forward to noble Lords’ contributions. I beg to move.
My Lords, it is a great pleasure to follow the noble Lord, Lord Gilbert, and to have the first opportunity of thanking him for his chairmanship of our committee—a task he carried out with great skill, bearing in mind that the landscape we were surveying was changing as the report was being written. I also echo his thanks to the clerk of the committee, Theo Pembroke, and his team, and to our specialist adviser, Professor Andrew Murray.
I agree with the noble Lord, Lord Gilbert. The internet has already brought huge benefits to society: shopping, travelling, research and even managing one’s own health and fitness are easier. But in some ways, the awe and wonder at the things the internet could do for us that greeted its initial arrival seem to have worn off and to have been replaced by a degree of mistrust. The feeling is that somehow the internet is trying to control us; people imagine a Nineteen Eighty-Four scenario coming about. The feeling is that we are being manipulated by internet companies, that our data is being taken from us without our knowledge, misused and sold on, and that it has all got slightly out of control. Now there is a danger of legislation overreacting, and I hope the House will agree that we should adopt the same approach as the title of a seminar I intend to attend next week: How to Regulate the Internet Without Breaking It.
Undoubtedly in some countries, legislation is being proposed which would not only regulate the internet but stretch its resilience to breaking point. If we take, for example, the GDPR, as the noble Lord, Lord Gilbert, alluded to, it has not yet been fully implemented. We do not know how its interaction with the internet is going to work—we all hope it will work well.
Another reason why the gloss has come off the internet slightly is that it has made two big enemies in recent years. One is the advertising industry, which is extremely powerful and suddenly felt it was being rather defrauded by internet companies. They must make their peace with the advertising industry, and they will be set a high bar of improvement before they are let off the hook. The other body that has turned against the internet is the press, understandably. The press has largely been put out of business by the internet, so any misdemeanours by internet companies are not short of publicity in our newspapers.
As the report says in its very title, we live in a digital world, and it is important that we recognise that digital is part of that world and should not be treated separately from it. I would be interested, for example, in at some point having a debate on the Competition and Markets Authority inquiries. If it is inquiring into high street retailers, should it include Amazon? If it is inquiring into private transportation, should it include Uber? If it is inquiring into hospitality and hotels, should it include Airbnb? All of these are major players, yet are somehow classified as separate from the functions they exercise. That does not seem to make very much sense.
The approach to regulation should be the one suggested by the noble Lord, Lord Gilbert, and the committee. We need a principles-based approach which is nuanced, because we face difficult decisions. If we take, for example, the issue of anonymity, some people looking at the abuse on social media by people who remain anonymous say that the situation could be remedied by forcing everyone who makes a statement online to leave an address at which they can be traced, so that action and redress can be sought by the person who has been offended. This is a very good idea in its own right, but what effect would it have on whistleblowers who perhaps live under regimes where they would be exposed, even to loss of life? It is going to be difficult to get it right, and knee-jerk reactions are not appropriate.
Ideally, we need a blend of the stick and the carrot. The stick is the threat of legislation, and as the noble Lord, Lord Gilbert, has outlined, we have suggested a digital authority with new, real powers to enforce statutory regulation if required. Clearly preferable, if we could achieve it, would be self-regulation and co-regulation. The idea of ethical design will not work unless the companies embrace the 10 principles we have put forward and recognise that they are in their own self-interest, as well as the public interest. Self-interest should also lead them wholeheartedly to adopt the principles that we have advocated. Unless there is trust in the internet, people will not yield their data for use, and the internet business model of a great many companies will be out of the window.
In recent days, I have been in contact with the Internet Advertising Bureau, and there is further progress with its gold star system. Since it gave evidence to us, the number of companies taking part has gone up from 52 to 105, and the number of people certified has risen from 12 to 91. The gold star is awarded only if they pass a reasonable standard in avoiding ad fraud, producing transparency and preserving brand positioning. If the regulation is inadequate, it is up to the advertising industry to drive a harder bargain. Likewise, in other forms of regulation, it will be up to the digital authority to say that the bar could be set a little higher. With gentle nudging—once we have got the issue of regulation accepted as a principle—where the bar is put is a matter for negotiation. If we can get 95% of what we are looking for by self-regulation, it is worth forgoing the other 5% unless it is absolutely vital; but others will differ, and there will be some issues where we need 100% support.
Turning to the powers of the digital authority, I do not want to go over what the noble Lord, Lord Gilbert, has put forward, but rather link it to the debate we have just had on Henry VIII powers, abuse of powers by government and statutory instruments. This is another area where our suggested model could be of some help. Our idea is that there should be a digital authority answerable to the highest level in government. Where that is set is up to government, but it has to be somebody who can call the shots: rather than simply asking the health service to do something, it has to be a body powerful enough to tell the health service to do something—or even the Treasury, which, as your Lordships will recognise, would be a constitutional breakthrough.
The Joint Committee of both Houses, apart from producing a quarterly report on the landscape, would have the power to say that there is an urgent problem which needs dealing with. Let us be honest, Parliament is hopelessly inadequate to deal with the internet. By the time legislation is introduced and passed, a year will have gone by with no bother, the landscape will have changed and evasive action will have been taken. We need action to be taken quickly, and if the committee of both Houses were to endorse giving the Minister the powers to deal with the problem, it would carry a lot of weight in both Houses. They would feel that it was not simply a question of giving the Minister Henry VIII powers, but that those powers had been subject to some degree of scrutiny.
If we can get this right, it could be life-changing for our country. The internet is, in its own way, a much more important invention than printing, because of its interactivity. If we can get it right, and if we can align public interest and self-interest, we will harness one of the great positive forces for good in our society.
My Lords, I too want to say what a great honour—and, indeed, an education—it has been to serve on the Communications Select Committee for this House, and to have had a small say in the production of this important report. It is always a great joy to follow the noble Lord, Lord Gordon, and, indeed, the noble Lord, Lord Gilbert, who has chaired our committee with such wit and patience.
The Government have already committed themselves to making the United Kingdom the safest place in the world to be online. The ideas in this report explain that this does not necessarily require more regulation, but a different approach to regulation. It is not an exaggeration to say that this is one of the big moral challenges of our day. We need to get it right, especially for our children, for there are no longer two worlds, the online and offline, but the one digital environment that we all inhabit and that needs a set of principles to govern not just its oversight but its future development.
When I take my child to the park or cinema, go to a restaurant, travel by public transport, go shopping or, to escape the hustle and bustle of either this place or my day job, lie on the beach or snooze on a park bench in Parliament Square, those who own and manage these spaces have responsibilities to those of us who use them. These responsibilities are laid out in legislation overseen by various different bodies. However, behind it is the principle that we have responsibilities of mutual care and respect. If the salad Niçoise I order in the restaurant is dressed with bleach, or the film has no guidance about the appropriate age for a child to watch it, or the deck chair I hire gives me splinters in unmentionable places—I will not say what I was going to say; I have to remember I am a Bishop; with this outfit it should not be difficult—those who have responsibility for the space are liable.
We have a phrase for this in the English language: common sense. However, common sense is rooted in a thoughtful and developed moral tradition whereby we recognise our common humanity and resolve to live by an agreed set of principles and standards. The digital world cannot be exempt from this moral framework. Neither is it sufficient for regulators to mitigate and alleviate its worst excesses. Why should we have to ask Facebook to take things down? Would it not be better if they were never put up in the first place?
This need not curb free speech. In fact, in the ever-increasing world of fake news and all the rest of it, it might be the salvation of free speech—for freedom is not freedom to say what I like and do as I please without regard to others, but to be free to do what we must to serve the good of all.
Self-regulation is manifestly failing. A few powerful companies dominate the digital landscape. They say they are platforms, with little or no responsibility for those who walk upon them, but they are actually public spaces with a duty of care to those who enter. I am pleased that the Government have embraced that concept but they seem reluctant to fully embrace the principles-based approach to the internet that this report recommends. The principles-based approach applies, yes, to regulation but also, critically, to policy and development, so that we might create a different future. We believe that will require an overarching body, as the noble Lord, Lord Gilbert, has explained, that we call the digital authority. If we do this, it might cut the Gordian knot of a tangle of competing bodies and rules, and so hold out the possibility of the UK taking a lead on an issue that is rising rapidly up the agenda of public concern. We ask the Government to look at it again. Those clever algorithms which are so good at selling us stuff could be used to design the internet differently. What is now required is the political will to make it happen.
Finally, I remind the House of the regulations which have already been introduced. Several bishops, including myself, recently wrote to the Information Commissioner, Elizabeth Denham, in support of what I think is called—the noble Baroness, Lady Kidron, will correct me later if I have got it wrong—the kids’ code that has been put forward in draft. We made the point that the online world shares the offline world’s ethical duty to differentiate between children and adults and to respect and protect both the vulnerable and the marginalised in the digital world.
We cited the example—hey, we are bishops and this is the way we do things—of Jesus’s most famous story of the Good Samaritan, where the Good Samaritan crosses boundaries in order to transcend the normal ways in which we do things in the different social and political jurisdictions we inhabit. Likewise, the tech sector and the digital world need to accept the demands of responsibility above profitability and to acknowledge their corporate responsibility to uphold the common good. We are concerned that the Government may row back from their commitment to introduce this code and fulfil their responsibility to children.
In the coming days, we will write to the Secretary of State on this matter. However, importantly, for this debate today, let us not keep reimagining a better future without also grasping the opportunity for that future to start today.
I too thank the noble Lord, Lord Gilbert, for his heroic chairing of this lengthy and complicated inquiry. I also thank the clerk, Theo Pembroke, and the specialist adviser, Professor Murray, for gathering a distinguished array of witnesses and shaping this report, of which I and other members of the committee are justifiably proud.
I want to concentrate my comments tonight on chapter 3, on ethical technology. The committee put some energy into understanding the role of algorithms in the digital world and the problems that might arise from unregulated artificial intelligence. I have been particularly struck by the evidence given by witnesses such as Professor John Naughton, who told us that the wider community, including government and industry, were dazzled by technology. He warned:
“We always have to be prepared to apply to it the standard levels of human scepticism that we apply to everything”.
In the report, we raised awareness of the many concerns surrounding AI decision-making. The committee responded with recommendation 6, calling on the Information Commissioner’s Office to set out rules for the use of algorithms in accordance with the principles laid out in chapter 2. We also recommended that the ICO publish a code of practice on the use of algorithms.
The GDPR is supposed to ensure that any data processing is transparent and fair and avoids bias and discrimination. The Data Protection Act, passed in May last year, enacts these requirements in English law. Yet, despite the DPA, recent surveys show that people are still concerned about the use of algorithms. They are worried about what kind of data is selected to influence an algorithmic decision, about the accuracy of the algorithms being used and about whether they are fair and free from bias and discrimination.
The ICO’s interim report, Project ExplAIn, published last week, attempts to lay the basis for ethical guidelines in AI decision-making. It explains that many digital organisations distrust transparency in AI decisions. They fear that it may lead to breaches of commercial sensitivity, infringement of third-party data and their programmes being gamed by users. However, these concerns need to be set against individuals’ need for organisations to give appropriately detailed explanations of AI decision-making. The report suggests that there is space to help bridge this divide and to help organisations foster a culture of informed and responsible approaches to innovation in AI technologies.
This work sounds like a good basis for the ICO to publish draft guidelines on ethical design in July, with final publication in October. These will go a long way towards improving the accountability of AI decision-making. I encourage the Government to ensure that these guidelines are in line with the principles set out in the report. Even so, they will be only guidelines. However well thought out they might be, I fear the digital world will always harbour organisations and individuals who do not want to abide by them.
The GDPR is limited. Article 22 of the GDPR and Section 14 of the Data Protection Act require suitable measures to safeguard individuals who are subject to solely automated decisions. This allows data subjects to appeal against an AI decision only when it is fully automated and there is no human involved. However, once human involvement in the decision is determined, the data subject cannot appeal. As many AI decisions are augmented by human intervention, this seems to be a loophole. Do the Government plan to plug this loophole and ensure that relevant legislation is brought forward to deal with any potential problem arising from this?
Ethical design is also relevant to my other great concern, raised in the report in paragraph 82, under the heading “Capturing attention”. It points out that digital companies are driven by the commercial imperative to seek and retain users’ attention. The EU Competition Commissioner, Margrethe Vestager, warns that this can lead to a form of addiction. On Monday, Barnardo’s issued a report expressing concern that children’s early access to electronic devices could lead to both addiction and a loss of key social skills as families spend less time talking to each other. This could cause the children problems with mental health and emotional well-being. The committee’s report anticipates these concerns and recommends that digital service providers, including entertainment and games platforms, record time spent using their service and give users reminders of extended use through pop-up notices.
In the debate on the online harms White Paper on April 30, I said that I was concerned that this problem was not being taken seriously by the Government. The White Paper says that the CMO’s review, which covered online gaming and internet addiction, did not find evidence of a causal relationship between screen-based activities and mental health problems. The White Paper shockingly concludes that the evidence did not support the need for parental guidelines or requirements for companies to behave responsibly in this area.
This lack of action is made particularly serious by the failure to confront the growing problem of gaming addiction, which affects so many young people, especially young men. Policymakers and psychologists across the developed world see this as an issue that needs to be addressed now. However, the White Paper almost ignores it.
In his reply to my April speech, the Minister said:
“I completely agree with what was said about the resistance of the gaming sector, in particular, to engage with this issue”.—[Official Report, 30/4/19; col. 933.]
He gave me his support, for which I was very grateful. It is now six weeks later. Can the Minister give me some assurance that the Government are working to ensure that the gaming industry’s resistance to dealing with gaming addiction will be seriously addressed? Failure to confront this issue quickly and comprehensively will lay the foundations of social and mental problems for generations to come.
My Lords, I will begin by commending and congratulating the Communications Committee and its chairman, my noble friend Lord Gilbert, for an excellent and very far-sighted report. I should declare my own interest: I was a trustee of Doteveryone until recently, and the chief executive of TalkTalk less recently.
I have personally campaigned for balanced internet safety regulation for a long time. I passionately believe in the good that the digital world is bringing to society. I also believe in free markets and competition driving that good. However, it is clear that we also need to have a civilised digital world and that it needs regulation to protect the vulnerable and to ensure a level, competitive playing field in order to continue driving innovation.
That position has felt quite a lonely place for quite a long time, with many of my fellow tech leaders arguing strongly that liberal markets will solve these problems; that the internet should be a completely open, unregulated space; or that no regulation is possible, because technology is moving too fast. On the other hand, campaigners have argued for blanket bans and blocks. I am therefore delighted to see—in this report, in the Government’s response in the online harms White Paper and in views expressed on both sides of the House, in this Chamber and in the other place—that there is a growing consensus that self-regulation of the digital world is not enough and not working, and that we need regulation that is thoughtfully designed across a whole range of potential social and economic harms.
I am particularly pleased to see agreement on legislating to create a statutory duty of care. That puts into practice the first principle that the committee’s report sets out: that we need to look for parity between the offline and online world wherever possible. A statutory duty of care that, in a sensible and balanced way, puts the onus on organisations to look after their customers and stakeholders seems to me a fantastic way forward, and we have plenty of offline precedent to guide us in our online regulation.
I would also like to congratulate the committee on its work in setting out a principles-based approach to regulation; its 10 principles are excellent. Why are some of those 10 principles not replicated in the Government’s thinking in their response? They seem to me a balanced and comprehensible set of guidelines for us to shape regulation for the future.
I would like to move to an important issue raised by my noble friend Lord Gilbert, on which I am less convinced that there is consensus: whether we should be addressing digital regulation piecemeal in each different part of society as it arises, or in a co-ordinated and more strategic way. In business, almost every large historic, physical, non-digital business has worked out that you need to bring digital leadership into one place for at least a period of time—you need to bring together all the teams looking at driving change on your digital agenda if you are really going to get momentum. It does not need to be done for ever. I have tried it both ways in my business career—keeping it separate or pulling it together—and, if you really want to create a step change in a physical organisation that is learning about the digital world, you need to have an overarching digital strategy and a team of people who specialise in looking at all the interconnectivity of these different digital issues.
It seems to me that the recommendation in this report to create the digital authority does exactly that in our physical society as we learn to integrate it with digital. The skills are too limited to keep them spread and the issues are too overlapping. It requires a different way of thinking from the old physical world. In all my experience in business, if you organise that together, you will get an acceleration of thinking and learning that can then be embedded in all the different parts of the system.
I think the committee is really on to something here. I am concerned that the Government do not appear to agree and instead prefer a more fragmented approach, creating additional regulators—which, as a good liberal Conservative, I do not like anyway—in what looks like an attempt to glue together this approach in a digital charter. To me, it looks more like a digital work plan than a charter, when compared with a statutory digital authority.
I am concerned for a number of reasons. First, I am worried that the digital charter is too close to politics. These are complex and technical issues that require a lot of detailed thought from experts who really understand the subject. Regardless of who is in charge in whichever Government we have, I am nervous about the digital charter being glued into the DCMS in a purely informal way. I also think it is too easily captured by powerful lobbyists. The tech industry is not separating out its approach to lobbying on digital regulation. Do not think for a moment that there are disparate teams working on online safety and online competition: there is one unified thought process coming through the tech industry. If we are going to get to the right, balanced answer, we should be doing the same.
It is dangerous to have your core digital strategy interwoven with an economic Ministry in DCMS. We are asking our DCMS Ministers and civil servants to be poacher and gamekeeper: to attract inward investment, but at the same time to create a fair, level playing field and safety net for the vulnerable.
Those are all reasons why, in principle, we should accept the recommendation of this report and establish a digital authority. I think we can see in practice why we should as well. Like the right reverend Prelate, I am concerned that the kids’ code—the age-appropriate design code—will get watered down through hugely effective lobbying from people who will tell you that it is impossible or that it should be very narrow. I am sure that the noble Baroness, Lady Kidron, will give us more detail on this when she speaks, so I will try not to steal her thunder. It is a great example of why, if we are not very careful, it is impossible to balance poacher and gamekeeper.
In conclusion, I would like to congratulate the Communications Committee and its chair on this excellent report, and to ask the Minister to reconsider the Government’s response and bring forward legislation to set up a digital authority and to implement the 10 principles set out in this report. I suspect that all of us in the Chamber this evening agree that this presents a real opportunity for us to do what we did in this country 150 years ago: to manage that balance between being open to innovation and protecting everyone in society as technological innovation gathers pace. This is a hugely exciting report and I am delighted to be part of the debate this evening.
My Lords, it is always a pleasure to follow the noble Baroness, Lady Harding, who, not for the first time, has beautifully articulated some of my points. But I intend to repeat them, and I hope that they will emerge not as stolen thunder but as a common cause, and perhaps a storm around the House as others speak also.
Since my time on the committee shortly comes to an end, I take this opportunity to record my personal thanks to the noble Lord, Lord Gilbert, for his excellent chairmanship throughout, and to pay tribute to my colleagues, who make our meetings so fantastically interesting, collaborative and, occasionally, robust. I also thank the clerk, Theo Pembroke, who has always met our insatiable curiosity with extraordinary patience and good humour. I draw the attention of the House to my interests as set out in the register, particularly as chair of the 5Rights Foundation.
In its introduction, Regulating in a Digital World offers the following observation:
“The need for regulation goes beyond online harms. The digital world has become dominated by a small number of very large companies. These companies enjoy a substantial advantage, operating with an unprecedented knowledge of users and other businesses”.
Having heard from scores of witnesses and read a mountain of written evidence, the committee concludes that regulatory intervention is required to tackle this “power imbalance” between those who use technology and those who own it. As witness after witness pointed out,
“regulation of the digital world has not kept pace with its role in our lives”;
the tech sector’s response to “growing public concern” has been “piecemeal”; and effective, comprehensive and future-proof regulation is urgent and long overdue. It is on this point of how the sector has responded to these calls for regulation that I will address the bulk of my remarks today.
Earlier this year, Mark Zuckerberg said:
“I believe we need a more active role for government and regulators. By updating the rules for the internet, we can preserve what’s best about it ... while also protecting society from broader harms”.
Meanwhile, Jeff Bezos said that Amazon will,
“work with any set of regulations we are given. Ultimately, society decides that, and we will follow those rules, regardless of the impact that they have on our business”.
These are just two of several tech leaders who have publicly accepted the inevitability of a regulated online world, which should, in theory, make the implementation of regulation passed in this House a collaborative affair. However, no sooner is regulation drafted than the warm words of sector leaders are quickly replaced by concerted efforts to dilute, delay and disrupt. Rather than letting society decide, the tech sector is putting its considerable resource and creativity into preventing society, and society’s representatives, from applying its democratically agreed rules.
The committee’s proposal for a digital authority would provide independence from the conflicts built into the DNA of DCMS, whose remit to innovate and grow the sector necessarily demands a hand-in-glove relationship but which also has a mandate to speak up for the rights and protections of users. More broadly, such an authority would militate against the conflicts between several government departments, which, in speaking variously and vigorously on digital matters across security, education, health and business, are ultimately divided in their purpose. In this divide and rule, the industry position that can be summed up as, “Yes, the status quo needs to change but it shouldn’t happen now or to me, and it mustn’t cost a penny” remains unassailable.
The noble Lord, Lord Gilbert, set out many of the 10 principles by which to shape regulation into an agreed and enforceable set of societal expectations, but they are worth repeating: parity on- and offline, accountability, transparency, openness, privacy, ethical design, recognition of childhood, respect for human rights and equality, education and awareness-raising, and democratic accountability. I want to pick up on one single aspect of design because, if we lived in a world in which the 10 principles were routinely applied, maybe I would not have been profoundly disturbed by an article by Max Fisher and Amanda Taub in the New York Times last week, which reported on a new study by researchers from Harvard’s Berkman Klein Center. The researchers found that perfectly innocent videos of children, often simply playing around outside, were receiving hundreds of thousands of views. Why? Because YouTube algorithms were auto-recommending the videos to viewers who had just watched “prepubescent, partially clothed children”. The American news network MSNBC put it a little more bluntly:
“YouTube algorithm recommends videos of kids to paedophiles”.
However, although YouTube’s product director for trust and safety, Jennifer O’Connor, is quoted as saying that,
“protecting kids is at the top of our list”,
YouTube has so far declined to make the one change that researchers say would prevent this happening again: to identify videos of prepubescent children—which it can do automatically—and turn off its auto-recommendation system on those videos.
The article goes on to describe what it calls the “rabbit hole effect”, which makes the viewing of one thing result in the recommendation of something more extreme. In this case, the researchers noticed that viewing sexual content led to the recommendation of videos of ever younger women, then young adults in school uniforms and gradually to toddlers in swimming costumes or doing the splits. The reason for not turning off the auto-recommend for videos featuring prepubescent children is—again, I quote the YouTube representative’s answer to the New York Times—because,
“recommendations are the biggest traffic driver; removing them would hurt ‘creators’ who rely on those clicks”.
This is what self-regulation looks like.
Auto-recommend is also at the heart of provision 11 in the ICO’s recently published Age Appropriate Design Code, which, as the right reverend Prelate said, is commonly known as the “kids’ code”. Conceived in this House and supported by many noble Lords who are in the Chamber tonight, provision 11 prevents a company using a child’s data to recommend material or behaviours detrimental to children. In reality, this provision, and the kids’ code in general, does no more than what Mark Zuckerberg and Jeff Bezos have agreed is necessary and publicly promised to adhere to. It puts societal rules—in this case, the established rights of children, including their right to privacy and protection—above the commercial interests of the sector and into enforceable regulation.
Sadly, and yet unsurprisingly, the trade association of the global internet companies here in the UK, the Internet Association, which represents, among others, Amazon, Facebook, Google, Twitter and Snapchat, is furiously lobbying to delay, dilute and disrupt the code’s introduction. The kids’ code offers a world in which the committee’s principle—the recognition of childhood—is fundamental; a principle that, when enacted, would require online services likely to be accessed by children to introduce safeguards for all users under the age of 18.
The Internet Association cynically argues that the kids’ code should be restricted to services that are “targeted at children”, in effect putting CBeebies and “Sesame Street” in scope, while YouTube, Instagram, Facebook, Snapchat, et cetera, would be free to continue to serve millions of children as they alone deem fit. The Internet Association has also demanded that children be defined only as those under 13, so that anyone over 13 is effectively treated like an adult. This is out of step with the Data Protection Act 2018 that we passed in this House with government agreement, which defines a child as a person under 18. Moreover, in the event that it is successful in derailing the code in this way, it would leave huge numbers of children unprotected during some of the most vulnerable years of their life.
Perhaps the most disingenuous pushback of all is the Internet Association’s claim that complying with regulations is not technically feasible. This is a sector that promises eye-watering innovation and technical prowess, that intends to get us to the moon on holiday and fill our streets with driverless cars. In my extensive conversations with engineers and computer scientists both in and out of the sector, no one has ever suggested that the kids’ code presents an insurmountable technical problem, a fact underlined by conversations I had in Silicon Valley only a few weeks ago. Yes, it requires a culture change and it may have a price, but the digital sector must accept, like all other industries have before it, that promoting children’s welfare—indeed, citizens’ and community welfare more generally—is simply a price of doing business. Let us not make the mistake of muddling up price and cost, since the cost of not regulating the digital world is one that our children are already paying.
Regulating in a Digital World establishes beyond doubt that if we want a better digital world, we must act now to shape it according to societal values, one of which is to recognise the vulnerabilities and privileges of childhood. I recognise and very much welcome the future plans of the Government in this area, but if we cannot get one exemplar code effectively and robustly into the real world, what message does that send to the sector about our seriousness in fulfilling the grand ambitions of the online harms White Paper?
When replying, could the Minister give some reassurance that the Government will indeed stand four-square behind the Information Commissioner and her ground-breaking kids’ code? In doing so, will they meet the expectations of parents, who have been promised a great deal by this Government but have not yet seen the change in the lived experience of their children? More importantly still, will they meet the needs and uphold the rights of UK children, rather than once again giving in to tech sector lobbying?
I will finish with the words of a 12-year-old boy whom I met last Thursday in a 5Rights workshop. A self-professed lover of technology, he said, “They sacrifice people for cash. It makes me so angry. I can’t believe that people are so unnecessarily greedy”. His words, remarkable from someone so young, eloquently sum up the committee’s report.
My Lords, first, I thank the committee for its very thorough report and its chairman, the noble Lord, Lord Gilbert, for introducing it so ably and with such eloquence. However, I am one of the few Members who disagree with some of what the report says.
First, it is impossible to regulate the internet in a small nation state such as the UK. The internet is international. It is broad and goes across the whole world. Therefore, it is impossible to regulate it within one country. It may be that this is my anti-Brexit speech, but so be it. The fact is that you cannot regulate the internet in one country and one country only. You have to be part of a broader international scene to do that.
Secondly, there is a danger in overregulation of the internet, in that it stifles innovation. Innovation is at the core of all that we do in this matter. On balance, we are probably looking to overregulate the internet in this country—in this country only—and some of these big international companies will simply move elsewhere rather than stay here. Certainly, we must be very wary of overregulating the internet if as a result we stifle innovation, which is so important in the modern world.
Thirdly, if anything, the balance is in the internet’s favour: more good comes out of it than harm. I think the report is negative, to some extent, in that it tends to go overboard on what is wrong with the internet, rather than telling us what is right about it. For instance, I do all my banking—or nearly all of it—on the internet. I do not go to the bank. When I went to my own bank branch recently, which has now closed, I looked around and said, “Oh, you’ve done this up”. One of the clerks said, “Yes, five years ago, Mr Maxton”. I have a Bank of Scotland app, with all of my bank accounts. I transfer money from one account to another, pay by BACS and pay on the internet. When I put my card into a machine at a bank, in a shop or wherever it might be, that too is the internet at work.
Most of the apps I use are simply there to provide a service. I read on a Kindle; I do not read printed books any more. A lot of authors are now bypassing publishers and going straight to Amazon to write for it directly. If they go to Amazon, they get a greater return. The price is lower than that of a printed book, but they do not have to pay a publisher, a bookseller or all sorts of people to advertise it. It is advertised by Amazon and their return is higher. I group my apps: under “sport”, I have three golf clubs, a running club and a rugby club.
Lastly, I say to everybody who produced this report that the one thing that has not been mentioned is “school” or “education”. Perhaps it was briefly mentioned in the report, but schooling is important. Surely that is where this ought to begin. We ought to start there by telling children how to deal with the internet. Instead, we tell them how to make computers, and a small proportion of them may be able to do that. The fact is that we do not tell them about the dangers the internet possibly has—I stress “possibly”. I will finish there, because I am very aware that we want to finish quickly.
My Lords, like other speakers, I congratulate the noble Lord, Lord Gilbert, and his committee on this useful report. I say at the outset that I am a director and trustee of Full Fact. It has been a great pleasure to discuss Communications Committee matters with many Members of your Lordships’ House, as I had the great privilege of doing that years ago when I was fortunate enough to chair the committee; those were some of the happiest times of my life in this House.
To make this report manageable, it was really sensible to set on one side many of the technical aspects of the internet—not least because I for my part cannot understand them—and a number of the aspects and implications of the phenomenon of large-scale data transfer in the context of areas such as the internet of things. Rather, we have a report that seems to concentrate on the relationship of the internet with individual human beings essentially in a personal capacity.
In this context, the internet is a complicated and potentially complicating intermediary between two separate things—a source of information and its consumer—which in turn may well involve commercial transactions and/or marketing of products or services in a way which, until recently, was unthinkable. I say as an aside that this may be one of the most abstract House of Lords reports I have read. This is not a criticism; it is a symptom of the difficulty of the problem we are looking at, which, in its domestic form, is merely part of a wider global problem which is embedded here in the United Kingdom—this is picking up on a point made by the noble Lord, Lord Maxton. We are talking about a fast-moving and ever-changing technology, and the techniques used to apply it, set in a global system.
I think this is the right starting point, because the report rightly comments that a “principles-based approach” is probably the only way to put in place a remotely relevant framework in such a fast-changing environment. From this starting analysis, it is important that the relationship between government—not only our Government—and those who effectively have the greatest influence and control over the net, the FAANGs of the moment, amounts to a reasonably amicable modus vivendi, each understanding the role of the other. The latter, who can organise much of their activity beyond the reach of any traditional Government, must retain consumer and public confidence, while Governments must try to ensure the benefits of this new technology are maximised for their citizens’ benefit.
It follows that Governments in the western world, many outside it—excluding certain authoritarian regimes—and the big tech companies, taking a longer view, have a mutuality of interest in working together. As Tim Berners-Lee, who was quoted at the beginning of the report, has said, the internet is potentially a great force for good. One should add to that proposition that it is one that is not going to go away. On the one hand, Governments need to enlist internet players to assist in dealing with human, financial, political and reputational harm, as well as bettering the human condition more generally. At the same time, internet businesses, by being involved in that, can expect a regime in which they will get a reasonable return—on which appropriate tax should be paid.
An important point contained in the report is that there should be as seamless a join as possible between online and offline rules of behaviour and conduct, although clearly there are some attributes unique to the internet, such as algorithms, which may require their own rules. Rules must be not only enforceable; where appropriate, they must be enforced.
As I have mentioned, while some of the problems which arise are essentially domestic, many are not. Sometimes those which are not are less obvious; for example, some of the corruption we have seen in domestic UK elections of late. Natural boundaries and traditional jurisdictions are in many cases irrelevant. Our country, like every other, must accept that.
A shortcoming of the report, which may be deliberate—for reasons I have touched on already—is the almost complete silence about the cross-border, cross-jurisdictional aspect of the internet’s workings. The noble Lord, Lord Maxton, pointed this out, and allusion is made to it in the text. As the report points out, leaving the EU presents some real difficulties in this regard. They can, and no doubt will, be negotiated and dealt with in another way in a post-Brexit world, but the problem has always gone much further than the European Union. Collectively, Governments of the world must somehow evolve a universal and consistent framework for the modus operandi of the internet, as viewed from the perspective of consumers and enforcers, although exactly how that might be done is beyond my pay grade at the moment.
I think there are two parallel problems. First, jurisdictional difficulties need to be set aside to try to achieve some kind of workable international transactional homogeneity. As part of this, those who influence the way in which the internet works need to be brought into the rule-making process. This is not the traditional approach to lawmaking within the nation state, but there is a need for a system which brings about an agreed and accepted outcome into which there is general buy-in, subject in the last analysis to effective enforcement. If there is not, it will not work. Secondly, however that may be brought into being, the arrangements must be living, or they will rapidly become outdated, as has already been said.
Currently, in the midst of the Brexit crisis—if I may describe it that way—much of the focus of the debate has been on the legislative structure of the European single market. The system that emerges may make that look positively simple. If so, so be it, because unless we grasp this particular nettle, the likely outcome is an anarchic muddle.
At the start of my remarks, I commented on the somewhat abstract character of the report, only to then make a generalised and somewhat abstract speech myself. However, it seems to me that if a regulated, working internet is to achieve its full potential in the wider public interest, it must be brought into being as a result of some quite radical actions, and radical thinking is required to do that. This is not simply a domestic issue which relates to domestic activities. It goes much further than that. International problems require international solutions.
My Lords, the hour is late and everything that needs to be said has been said, but not yet by me. However, your Lordships will be happy to know that that is the way it is going to stay because I really just want to emphasise one issue, which has been widely addressed by other contributors to the debate. I thank our chair, the noble Lord, Lord Gilbert, who led this committee with tremendous grace. I will not say that it was made up of cats or that it was especially difficult to herd its members, but it had its challenges, as the noble Baroness, Lady Kidron, said. We also had fantastic support from the clerks and our adviser, as has also been said.
I also thank the Government for responding so promptly to the report, which allowed this debate to happen while its findings were still current. This has not been the case on every occasion, and in this particular realm there is a need for issues to be addressed quickly because otherwise they are not the issues today that one thought they were yesterday.
Digital technology is not my area of expertise, so I have learned a very great deal more from witnesses and colleagues than I have been able to contribute. I have discovered, however, that there is some value in being a relative innocent in the digital realm. The value to me was that I have had to work jolly hard to understand what was being put in front of me. I do not think that I have always understood all of it, but I have certainly understood something. The main thing that I have understood is blindingly obvious: the digital world, referred to in the title of the report, is not a parallel universe that we can step into or out of at will. It is the world. It affects and infects every aspect of our lives, whether we like it or not. I will simply give the House a few obvious examples. It affects our politics and our democracy; it affects the way we buy and sell things; it affects the way we access public services and medicine, and it infects and affects our domestic and private lives.
I do not know how many noble Lords are watching the BBC TV series “Years and Years”, written by Russell T Davies. It is an absolutely brilliant piece of dystopian imagination. Threaded all the way through it is a dependency on digital technology which every single person in the world it describes—only a few years on from today—has to recognise. The wonderful thing about it, apart from the brilliant writing and performances, is that some of what can be seen in it is clearly not exactly what we have today but so close as to be recognisable. I mention it because it tells us that it does not take very much—of course, Mr Davies’s imagination is a good deal more far-reaching than all of ours—to realise just how close we are to that kind of really deep-rooted dependency.
I was part of a conversation earlier today, as part of my work on the committee, with a group of 17 and 18 year-olds—year 12, in other words—who came to talk to us about their viewing habits and how they accessed television. Of course, what they described was a way of working with the technology that they have available to them. This is certainly quite different from the way that I work with what I have available to me because they are completely familiar with it. They understand the way that it works and the opportunities that it offers to them. These young people were using this technology very creatively; they were very clever and savvy and healthily sceptical about what was put in front of them. However, they need and deserve effective regulation and, furthermore, they know that they do. The contributions from the right reverend Prelate and the noble Baroness, Lady Kidron, made these points very effectively, more effectively than I can.
Given this reality, and given the world we now live in, it must surely be the case that effective regulation is not just nice to have: it is absolutely essential. It is fairly clear that self-regulation, which has been depended on up till now, is inadequate, and that the nature of regulation itself has to be rethought, which was the point made at the very outset of this debate by the noble Lord, Lord Gilbert. It has to be rethought with far greater emphasis on working collaboratively across boundaries and sectors. This is the rationale—which was so expertly analysed by the noble Baroness, Lady Harding, in her contribution—behind the committee’s recommendation that a new digital authority be set up.
Like others, I welcome the recent online harms White Paper and I am glad that the Government are broadly sympathetic in their response to the committee’s analysis and recommendations. However, I note that their response to this key recommendation is what might be called a bit lukewarm. This recommendation on the digital authority suggests that a single, overarching co-ordinating body, linked to a Joint Committee of Parliament, is potentially the most effective way of ensuring regulatory coherence in the fast-moving world of technological development.
To be fair, the Government’s response accepts the need for,
“a coordinated and coherent approach across the various sector regulators and bodies tasked with overseeing digital businesses”,
but it then sets out a rather less than coherent way forward:
“As part of this programme of work, we look to the tech sector, businesses and civil society, as well as the regulators themselves, to own these challenges with us, using our convening power to bring them together to find solutions where possible”—
I emphasise “where possible”. Later it says:
“The government is carefully considering potential overlaps between new regulatory functions, such as that proposed through the Online Harms White Paper, and the remits of existing regulators. Consolidation of these functions, or a broader restructuring of the regulatory landscape, could”—
again, I emphasise “could”—
“play an important role in supporting an effective overall approach to the regulation of digital, as well as minimising burdens on businesses … We thank the Committee for their recommendation and will carefully consider this and their other recommendations as we continue to assess the need for further intervention”.
In one way there is nothing wrong with that, but I do not detect any great sense of urgency. Speaking just for myself, I think these matters are urgent. Actually, I think the committee thinks so too, and the report says that. I fear that we are in danger of being completely outrun by the speed of change. I realise that the Government are in a difficult place at the moment, and I do not say that disrespectfully, but while they are pausing to sort themselves out our digital world is moving on apace, and it will not wait for us. I hope the Minister can assure us that the necessary momentum will gather before it is too late.
My Lords, digital regulation is an incredibly complex subject, as we have heard, and it covers a wide range of diverse areas, so I am very grateful to the committee and to the noble Lord, Lord Gilbert, for producing this comprehensive report. I will focus tonight on data, and I apologise now to the noble Lord, Lord Inglewood, because I am going to get a little bit into the nuts and bolts. In doing so, I am going to concentrate principally on Google, but some of the issues that I raise apply to a greater or lesser extent to other platforms.
Google is the world’s largest digital advertising company but it also provides the world’s leading browser, Chrome; the leading mobile phone operating platform, Android; and the dominant search engine. Its Chromebook operating system, while smaller, is growing fast, and it offers myriad other services to the consumer, such as Gmail, YouTube, Google+, Maps, Google Home and so on. These services are mostly provided to the consumer for free, and in return Google uses them to collect detailed information about people’s online and real-world behaviour, which it then uses to target them with paid advertising.
It collects data in two principal ways. First, active collection is where you are communicating directly with Google—for example, when you sign in to a Google account and use its applications. When you are signed in, the data collected is connected to your account, in your name. Secondly, Google applies a passive-collection approach. This happens when you are not signed in to a Google service but the data is collected through the use of the Google search engine, and through various advertising and publishing tools that use cookies and other techniques to track you wherever you go on the net—or indeed physically. It can still track your device location even if you are not an Android user. Do not think that avoiding all Google software will help; most websites have Google tools embedded in them and will place Google cookies on your device regardless.
The sheer quantity of data that Google collects every day is staggering. A recent study by Professor Douglas Schmidt of Vanderbilt University simulated the typical use of an Android phone and found that the phone communicated 11.6 megabytes of user data to Google per day—that is just one device in one day. As an aside, the phone is using your data allowance; you are paying for it to send all this data back to Google. The experiment further showed that even if a user does not interact with any key Google applications, Google is still able to collect considerable data through its tools and by using less visible tracking techniques.
The greatest safeguard over the collection and use of data has to be transparency. As users, we need to understand what is being collected, by whom and what for, and we need the ability to stop it and delete it if we wish. The GDPR and the Data Protection Act represent a step forward but it is already becoming clear that they may not be sufficient for the fast-moving digital world. How many people really understand what Google or indeed any other platform is collecting about them? This is going to become even more important as 5G and the internet of things take off.
As part of the right to be informed under the GDPR, websites now need to ask consent to use cookies. However, as we have all seen, the consent pop-ups usually say something general such as, “Cookies are used to improve and personalise our services”. It remains very difficult to find out precisely what data is being transferred, to whom and why. This is then complicated further by the fact that accepting cookies on a site usually means accepting not only the cookies for the site concerned, but also third-party cookies, including Google. Amazon, for example, lists 46 third parties that may set cookies when you use Amazon services, with no clear explanation of what each is doing, or what the relationship is. This is not transparent. Remember, cookies are only one way to collect the data. Others are less visible, such as browser fingerprinting. Blocking cookies does not stop data being collected.
The GDPR also gives us the right to obtain the data that is held on us, but there are a number of problems. First, it is hard to know who has your data, because of the many third parties I have spoken about, with which you have no direct relationship but which are collecting data on you. Secondly, only data deemed personally identifiable will be provided. In Google’s case, this includes only the data that it has collected using the active process I described earlier, when you are logged into a Google service. However, as Professor Schmidt’s study showed, the majority of the data Google collects comes from the passive collection method. This data is described as user-anonymous, being linked to different identifiers, such as your device or browser ID; but if you log into a Google service from the same device or browser, either before or afterwards, Google is able to link it to your account.
Thirdly, as the committee’s report points out, the data that must be provided in response to a request does not include the behavioural information that derives from your data. I strongly agree with the committee’s conclusion that this behavioural information should be made available to the subject. I further urge the ICO to look more closely at whether cookie consent requests really meet the right to be informed, and to consider whether data that the platforms describe as user-anonymous are really anything of the sort. There should also be a requirement to provide details of any data that has been provided to third parties, and to provide details of third parties that have been allowed to collect data through one’s website. Does the Minister agree with these suggestions?
The second issue that arises from the way data is collected is one of conflict of interest and market power. I have described the volume of data collected by Google. This is hugely facilitated when the operating system and browser of your phone or computer are provided by Google. In effect, this means that your device is not working for you or protecting your interests; it is working for Google, helping it to obtain your data. Google’s dominance in both browser and phone operating systems strengthens a network effect that has assisted its rise as one of the data monopolies, making it hard for others to break into the market and compete. There has been talk of splitting up these data monopolies, and there must be an argument for somehow separating the activities of providing operating systems and browsers from those of data collection and advertising. At the very least, we should insist on mandatory standards of user protection and transparency to be built into such operating systems and browsers. Doing this would ensure that the software works to protect the interests of the user, not the interests of the advertiser. This would be a strong step towards,
“data protection by design and by default”.
I continue to agree that the CMA should look into the digital advertising market, a recommendation repeated in the report, and urge that the structural conflict I have just described be considered as part of that. I am very sorry that the noble Lord, Lord Tyrie, has had to pull out of this debate. It would have been very good to have heard what he had to say on the subject. I urge the Minister to encourage the CMA to take a look.
In conclusion, I have suggested that the ICO should look into one element and that the CMA should review another—both elements are related. I think this emphasises the need for an expert digital authority, as the committee recommends, if only to act as gatekeeper and make sure that issues do not fall between the cracks.
My Lords, the noble Lord, Lord Vaux, need not apologise. This is one of the few assemblies in the world where one would get as deep and thorough an analysis of the subject from one of its Members. I still remember the American Senate talking to Mark Zuckerberg and the chasm of understanding between the legislators and the techie was cruel to behold. So stay with us.
I want to refer to a comment by the noble Lord, Lord Inglewood. He talked about his chairmanship of the Communications Committee. I have never served on that committee or been its chairman, but for nine years I was leader of the Liberal Democrats here in the Lords and in that capacity I was on all the committees that looked at the structures of committees, et cetera. I can say that during that time there were one or two very serious attempts to get rid of the Communications Committee, usually by offering even more interesting things to its members. I resisted that strongly, because its ambit covers such an important future agenda that it should continue as a permanent committee of this House. Its importance is underlined by the report before us tonight and I congratulate the noble Lord, Lord Gilbert, both on the way he introduced it and on the way he herded the cats on the committee, as we were told. He had my noble friends Lady Benjamin and Lady Bonham-Carter as members, so I know exactly what he was talking about.
The committee has already had its impact: the Government have acknowledged that their online harms White Paper was influenced by some of the committee’s recommendations. Some 16 years ago I served on the Puttnam committee, the pre-legislative scrutiny committee for what became the Communications Act 2003. That Act created Ofcom, which has developed into a feared and respected regulator with public interest responsibilities. That committee took the conscious decision 16 years ago not to look into the idea of regulating the internet. The world wide web was seen as a free good and a boon to mankind. Ten years later, added to that libertarian approach was the argument that the internet titans, the likes of Facebook, Amazon, Netflix and Google, were now so global and powerful as to be beyond the reach of any national jurisdiction—what I would describe as the Maxton approach.
Now the public mood has changed. As the noble Lord, Lord Gordon, said, that sense of wonder and awe has worn thin. In the United States, in Europe and here in the United Kingdom there is now a feeling that we have got to come to grips with the power of the internet. The chair of this committee, the noble Lord, Lord Gilbert, when launching this report said:
“A comprehensive new approach to regulation is needed to address the diverse range of challenges that the internet presents”.
Tonight, he called for urgent and compelling action. Tim Berners-Lee, the father of the world wide web, has said:
“While the web has created opportunity, given marginalised groups a voice, and made our daily lives easier, it has also created opportunity for scammers, given a voice to those who spread hatred, and made all kinds of crime easier to commit”.
The noble Baroness, Lady Kidron, quoted Mark Zuckerberg and other tech leaders as saying that they would now welcome some regulation, but I give a warning: do not underestimate the power of the lobbyists. The so-called FANGs have immense resources. I saw in the New York Times this week that even the Senate was backing off from too urgent action against them. In some ways, the story of the National Rifle Association should always be kept in mind if you are really challenging vested interests in a big way and, boy, that is what we are proposing to do.
The great debate is now about how and when we regulate. Both the committee report and the Government’s White Paper, along with many contributions to today’s debate, listed the harms and abuses that the internet has spawned—although I acknowledge along with the noble Lord, Lord Maxton, the many benefits that the internet has spawned as well. A few weeks ago the Health Minister was answering questions about the mental health damage to young people on the internet. She made the point in response, which I thought was very valid, “Yes, but also on the internet is found some of the help and advice that young people were often searching for, which they would not be able to find as easily elsewhere”.
We are talking about a balance, but the grooming and abuse of vulnerable groups, particularly children, is nevertheless one of the key things, and I pay tribute to the campaign that the noble Baroness, Lady Kidron, has led on this. As far as the kids’ code is concerned, all I can say is that we will be with her every step of the way, so she should keep going. There is of course use by terrorists, organised crime and, indeed, state agencies. There are also the undermining of democratic processes and the promotion of hate language towards race, sexual orientation and mental or physical handicap. The noble Viscount, Lord Colville, mentioned other health and social consequences, particularly with gaming addiction. The examples go on and on, and such a charge list creates a public and political demand that something must be done. The White Paper captures this sense of urgency when it says that things,
“have not gone far or fast enough”.
Our task is made easier by the committee’s recommendation of 10 principles to guide the development of regulation online. On the other hand, the recommendation that a new digital authority be created sets alarm bells ringing at the idea of yet another regulator in this sphere. We need to think carefully about what is needed. Such an authority will need a certain heft and clout to gain the respect of some pretty big beasts.
I remember that when the Puttnam committee was discussing the establishment of Ofcom we were told that Murdoch’s lawyers would eat this new regulator for breakfast. Well, it was not so. Now, 15 years on, we have reached a stage where “give it to Ofcom” seems to be the answer to every problem. That may be the answer, but let us weigh up the options. Whichever body becomes this digital regulator will have to work closely with the ICO, the CMA and other bodies such as the Centre for Data Ethics and Innovation, as well as self-regulators such as the ASA and trade bodies such as the Internet Association. But Parliament will then have to decide where the buck stops and who makes the key decisions.
There will also need to be early work on data literacy. Here I agree with the noble Lord, Lord Maxton, that the long-delayed recommendation of the Puttnam committee for a clear policy of data literacy education is important, in parallel with these developments. In addition, the CMA and the DCMS are going to need extra resources to take on their new responsibilities. I hope that I am not treading on too many toes in Whitehall if I say that there will be greater public confidence as we move forward if the DCMS is seen as the lead department, although of course the Home Office has a clear role in criminal, terrorism and intelligence matters.
I disagree with the statement that the DCMS cannot be both poacher and gamekeeper. The digital authority will have to have a parent department, but Parliament will need to be able to look at some detailed and specific proposals if we are to avoid a plethora of codes and regulators and a balkanisation of the system, a warning made by the noble Baroness, Lady Harding. I thought at one point that she was going to suggest that the whole lot be given to the Home Office, but she steered away from that nightmare. That is why it is not nostalgic of me to urge that, before we move to specific legislation, a draft Bill be submitted to a joint pre-legislative scrutiny committee of both Houses. The great benefit of the Puttnam committee process was its transparency and its open door, allowing all interest groups to have their day in court. The outcome was a piece of legislation which was better and more robust because of that pre-legislative scrutiny. I would be very interested to see that grow into a permanent Joint Committee of both Houses.
The noble Baroness, Lady McIntosh, mentioned democracy. One of the criticisms of the White Paper and the report is that they did not deal with the threat to our democracy posed by internet abuse. I am delighted to see on today’s Order Paper that a Committee of this House has been established to report on democracy and digital technologies. I was even more delighted when I saw that the noble Lord, Lord Puttnam, had been appointed chairman. I hope the Minister will assure us of his department’s full co-operation with the work of that committee.
My final appeal is that we remain major players in international discussions on these matters. Between 2010 and 2013, I was the Minister involved in the early stages of GDPR negotiations. The GDPR may have its weaknesses, but it is an example of how international agreements can be reached on these matters. In the ICO and its commissioner, we have a real asset to be deployed in seeking international co-operation. I agree with the noble Lord, Lord Inglewood: I see no reason why we should not have the ambition to create a kind of Geneva convention on rules of behaviour for the world wide web.
Not for the first time, the Communications Committee has produced a report which brings credit to this House and positive and useful advice to the Government, while providing clear guidance on the next steps for all of us in this complex and fast-moving world. In that respect, we are all in its debt.
My Lords, I declare that I was once very briefly a member of the Communications Committee, I think before the noble Lord, Lord Inglewood, took the Chair, although there was a point where he did appear in the Chair. I am not quite sure why that was, but it sticks in my memory. I therefore speak personally of the skills and expertise that have often gathered around that group.
We all owe a debt to the noble Lord, Lord Gilbert, for introducing this report. To say that it is a powerful and useful report is to repeat what a lot of people have said. However, the test is whether the members of the committee rally round and support it, and we have had a brilliant demonstration of that today. It is clearly a well-functioning and powerful group, but it has picked a topic of considerable importance and brought forward something which has made the whole House think again. The excellent speeches and the good debate we have had tonight are only part of the process. The report itself is a very good read. It may be abstract, but it certainly hits home.
The Government’s response was unusually prompt, but DCMS has a good record on this—certainly better than a lot of other departments. However, I felt, like others, that it was a bit defensive. It claims that the committee’s recommendations are closely aligned with what the Government are doing, although, as we have heard, the committee feels that its report goes much further. It argues that the issues are covered in the online harms White Paper and that, if they were not, they would be picked up by the Centre for Data Ethics and Innovation—talk about having it both ways. We will see how that goes. Is it true that the centre is not yet established as a statutory body? If so, will the Minister explain how it will provide independent expert advice on the measures needed if it remains an NDPB within his department?
I shall argue tonight that if, as the Government say in their response, it is clear that they must lead the way in tackling these challenges and there really is firm commitment to do what is needed, they need to be prepared to take on vested interests so that they can shift expectations of behaviour, agree new standards and update our laws, which is what they say they want to do.
Several members of the committee, perhaps reflecting their own contexts, have expressed concern about the Government’s commitment here, but I put it to the Minister that the Government should use this excellent report as a spur to further action. I suggest that the best way forward, as the noble Lord, Lord McNally, said, is to publish a draft Bill and allow it to be subject to pre-legislative scrutiny. That way, we can see what is happening, get the transparency we need and pick up the comments and expertise required.
We have a White Paper, which in common parlance means that a Bill is in prospect or might be in preparation—perhaps the Minister will confirm where we are on that. The Government and the committee certainly agree that the centrepiece of the new approach should be, as the Government propose, tripartite. It is a significant and welcome decision of the Government to legislate to establish a new statutory duty of care to make companies take more responsibility for the safety of their users online and tackle the harm caused by content or activity on their services, combined with legislation to ensure compliance with this duty by establishing an independent regulator with powers to implement, oversee and enforce any regulatory framework. Most importantly, the third leg of the stool is to create a new form of regulatory intervention which will help companies to thrive, while ensuring the safety of users, promoting innovation, guaranteeing freedom of expression and establishing other norms that underpin our democratic society—the democracy element is very important.
The reason that is so interesting is that it is a tripartite and interlocking approach. Like the committee, I broadly agree with what the Government are trying to do in ensuring that digital technology and the internet work for everyone—citizens, businesses and society as a whole. But there is far too little in the response to the committee to back up the Government’s assertion that the new system will answer the committee’s concerns that new technology will be deployed ethically as well as safely and securely, or that consumers will have the powers they need to ensure that their rights and views are not ignored, as they are at present, which is why the committee’s report is so important.
We all owe the committee a debt of gratitude for its work in setting out so comprehensively the challenges that the new regulatory environment will face, and the comments made by speakers today have been most useful in fleshing out the issues. How could it be otherwise, given that the skills, knowledge and experience represented on the committee are so incredibly useful?
I join several previous speakers in suggesting more action from the Government. I shall mention three of the committee’s recommendations which seem to me to have real merit, but which the Government seem to have downplayed. Like my noble friend Lady McIntosh—who is wearing three hats today—I felt that the Government’s response did not quite convince the neutral witness that they have the momentum, as I think she put it, to see this job through to the end. As I said, there is a test, which is the publication of a draft Bill.
First, on the smarter regulation proposal—the centrepiece of the speech of the noble Lord, Lord Gilbert, and the first point raised by him—the committee said that we need not more but different regulation for the internet. I agree with that. In paragraph 240, it comes up with a very interesting idea which fleshes out that concept. As the noble Lord said, the Government should establish another body with additional powers to ensure that digital regulation, wherever it happens, is kept up to date and in step. It has called it the digital authority and has listed the powers that it might have, aimed at co-ordinating regulation and regulators in the digital world.
There are very few new ideas in public policy, but I wonder whether this is one. There is the germ of a very good idea here, and I hope that the Government will take seriously the case for creating a body with powers to instruct other regulators to address specific problems or areas in the digital space. In cases where that is impossible because the problems are not within the remit of any one regulator, the digital authority should be well placed to advise the Government and Parliament of new or strengthened legal powers which are needed. The suggestion of combining this with a standing Joint Committee of Parliament is a very good one; that seems to square that circle very well.
Turning to the principles underlying regulation, the committee makes a very good point, which is that there should be a much more explicit set of principles underpinning the way in which any regulation applying to the internet should work. This may answer some of the points made by my noble friend Lord Maxton and others about the need for universal appeal, because if the principles are well constructed they will transcend any particular national boundary; they will be strong enough to apply across borders.
The 10 principles which the committee says should guide the development of regulation have already been discussed by both the noble Baronesses, Lady Harding and Lady Kidron, but they bear repeating: parity, accountability, transparency, openness, privacy, ethical design, recognition of childhood, respect for human rights and equality, education and awareness raising, and democratic accountability. This is a very powerful set of principles which, if taken up properly and put into words that apply to those who have to operate in this space, will bite. The Government say that the six principles they have specified in their White Paper are,
“closely aligned with those set out in this report”.
As the noble Baroness, Lady Harding, said, they are not exactly similar, and there are some important gaps. There is no mention of accountability: the processes that need to be in place to ensure individuals and organisations are held to account for their policies and actions. Nor is there mention of transparency: how we will see into the businesses and organisations operating in the digital world so that they are open to scrutiny—this very strongly picks up the point about algorithms. The other gaps are democratic accountability, which was picked up by the noble Lord, Lord McNally, and proportionality and evidence-based approaches. There may be ways in which these words appear in the Government’s list, but the fact that they have been drawn out in the committee report is important, and we should not lose that.
Market concentration was raised by a number of speakers. The report makes two important points that the Government have not picked up on well. The first is the way in which the internet operates specifically against the public interest, with large companies becoming data monopolies, mainly through mergers and acquisitions. The committee recommends that, in their review of competition law in the context of digital markets, the Government consider implementing a public interest test for data-driven mergers and acquisitions, so that the CMA can intervene, as it currently does in cases relevant to media plurality or national security. I agree with this. Secondly, the internet is characterised by a concentration of market power in a small number of companies that operate online platforms, and by the value it places on brands, platforms and other assets that are not well recognised in the physical world. The committee makes the point that these aspects of digital markets challenge traditional competition law, and it suggests that the Government broaden the consumer welfare standard to ensure that it takes adequate account of long-term innovation, and strengthen the power of the CMA by bringing the process of imposing interim measures up to date and making it more effective. I think this is something that the noble Lord, Lord Tyrie, has already proposed, so the Government may be able to respond to it.
Other speakers have picked up that the government response here is rather weak:
“We continue to consider policy options across the range of measures proposed”.
But the independent Digital Competition Expert Panel led by Professor Jason Furman published its recommendations for government on 13 March 2019, so there has clearly been plenty of time to pick this up and bring forward proposals. There needs to be legislative change here, so why not put this in a draft Bill since we already have the proposals?
I do not think anybody has picked up on the elephant in the room: the e-commerce directive. I think that is partly because it is complicated and made more difficult by Brexit. The point made by the committee is important: online communication platforms are utilities, in the sense that users feel they cannot do without them. As the report points out, the providers of these services have a safe harbour at the moment under the e-commerce directive. What are the Government going to do about that? I ask the Minister to pick up this point in particular. If we are staying in the single market, this would have to be done conjointly with the EU, and there are measures afoot to try to do something here. If we leave, we will have some flexibility. Can the Government share their thinking on this issue?
Finally, on my list of actions for the Government: content moderation. Again, this has not been picked up very strongly, but perhaps we have just become so used to it that we are unable to think again about it. One of the greatest frustrations of the internet is that the powers to remove content that is either illegal or causes harm are so ineffective—a point the committee makes in paragraph 224. One problem is that major platforms have failed to invest in their moderation systems, leaving human moderators overstretched and inadequately trained. AI is also not proving effective. There is little clarity about the expected standard of behaviour, and little recourse for a user seeking to reverse a moderator’s decision. I worry that relying on a new duty of care is not enough. What we also need is a much stronger consumer right, backed by a regulator who has the power to require action when users have genuine concerns. Will these new powers be considered?
I end with three smaller, but still important, points. Two or three speakers in the debate were concerned about data acquisition and the need for the publication of an annual data transparency statement. I absolutely agree with that. There is something here that we are not picking up. The Government’s response does not do justice to this important recommendation, and it is surely not sufficient to rely on the fact that this information should be set out clearly in a privacy notice.
The noble Viscount, Lord Colville of Culross, picked up the issue of addiction and made a very strong case. There are clear worries about how people become addicted to the internet in a way that has not yet been picked up well, although there are now some changes from medical authorities on this. We need to learn from the failure so far to deal with gambling addiction and gaming addiction. What is suggested in the paragraph is not going to solve this crisis, but it is a start. Voluntary efforts by the companies responsible for the problem are not the way forward. Will the Government look at this again?
Finally in this group, I turn to the matter of algorithms, which have already been touched on. How do you discover which algorithms are being used, what they are doing to your data and how all of that is going to work? We spent a lot of time on this when considering the Data Protection Bill. Had the noble Lord, Lord Clement-Jones, been here for the debate, I suspect that we would still be talking about it, but I am sure that the Minister is well rehearsed in the arguments. I look forward to a positive response. Something needs to be done here, but the Government are ducking the issue and are not doing well.
The Government are fond of saying that their White Paper is world-leading in terms of laying down statutory rules for the internet, but this report and our excellent debate tonight show that a bit of a gap is emerging between the rhetoric and the likely reality. I hope that I am wrong and I hope that the Minister can reassure us. Backing the kids’ code would be a start, but accepting the idea of bringing forward a draft Bill for consideration would be the way forward.
My Lords, I am grateful to my noble friend Lord Gilbert for introducing the debate and to the entire Communications Committee for its report. I think that it is clear and well thought through. I also thank all other noble Lords who were not on the committee but who have given us their views. This is an interesting area and the thought that has gone into the report is a tribute to noble Lords. However, plenty more needs to be done. As the report notes, the digital world plays an ever-increasing role in all aspects of life. The noble Lord, Lord Maxton, referred to that. As well as benefits and opportunities, this development has brought with it new challenges and risks. The noble Lord, Lord McNally, quoted Tim Berners-Lee in that respect. I think that the committee’s report is closely aligned with, although absolutely not identical to, the Government’s approach. I will explain some of the areas that we are considering and some where we do disagree.
The recently updated digital charter, which was also described as a digital work plan—it is that as well—is our response to the opportunities and challenges arising from new technologies. The committee’s report sets out 10 principles to shape and frame the regulation of the internet which resonate with the six principles that we set out in the charter. I will come back to those principles later. At this point I have to say that I do not agree with some of what the noble Lord, Lord Maxton, said. I believe that it is possible to regulate as long as it is sensible and proportionate. Indeed, Sir Nick Clegg has asked for reasonable regulation, as has been reported today in the newspapers. My Secretary of State has been to discuss this with Facebook and other tech companies in California. Where I do agree with the noble Lord and with my noble friend Lord Inglewood is that co-operation with international bodies is eminently desirable and will be useful. I personally have spoken about this at the G7, the D9, the OECD and the EU Council, and that was just me, let alone the Secretary of State and the Minister for Digital and the Creative Industries. We want to work with our like-minded international partners to determine how we can make the internet a safer place while protecting the fundamental rights and values on which our democracy is based. I can say that other countries are interested in our work in this area. I agree in a way with the noble Lord, Lord Stevenson, that we should not say too often that the work is world-leading; we ought to let other people tell us that.
The principles of the digital charter underpin an ambitious programme of work to ensure that the internet and digital technologies are safe and secure, are developed and used responsibly—with users’ interests at their heart—and deliver the best outcomes for consumers through well-functioning markets.
I will now set out in more detail some of the key areas of work that correspond to the committee’s recommendations. My department and the Home Office recently published the online harms White Paper—which virtually every noble Lord mentioned—setting out our plans to make the UK the safest place in the world to be online. I believe that the suggestions in that White Paper satisfy the committee’s 10 principles.
Illegal and unacceptable content and activity are widespread online, and UK users are concerned about what they see and experience on the internet. The balance that needs to be struck—this conundrum, if you like—was outlined by my noble friend Lady Harding. We agree with the committee that a duty of care is an effective response to tackle this problem. We intend to establish in law a new duty of care on companies towards their users, overseen by an independent regulator, on which we are consulting. As a result, as the right reverend Prelate said, tech companies will have to take responsibility; it will leave internet companies in no doubt about the responsibilities that fall within scope. We believe that this can lead towards a new, global approach to online safety that supports our values, as I said, but also promotes a free, open and secure internet. Speaking of democratic values, I also look forward to the ideas of the House of Lords special inquiry committee on democracy and digital technologies—chaired by the noble Lord, Lord Puttnam—which the noble Lord, Lord McNally, mentioned. I can confirm that, as always, DCMS will give it its utmost co-operation.
As the report identifies, organisations increasingly collect and use individuals’ personal data online. The noble Lord, Lord Vaux, gave us helpful detail on that. New technologies must be deployed ethically, as well as safely and securely. The Government take both the protection of personal data and the right to privacy extremely seriously. The GDPR and the Data Protection Act provide increased regulatory powers for the Information Commissioner’s Office, which strengthen our data protection laws to make them fit for the digital age.
However, the increased use of personal data with artificial intelligence is giving rise to complex, fast-moving and far-reaching ethical and economic issues that cannot be addressed by data protection legislation alone. In answer to the questions from the noble Lord, Lord Vaux, relating to Google in particular, I will look at those details again. It is fair to say that people can contact the Information Commissioner’s Office if they are worried about the use of their personal data by tech companies that may or may not be in compliance with the GDPR.
The Government have also set up the Centre for Data Ethics and Innovation to provide independent, impartial and expert advice on the ethical and innovative deployment of data, algorithms and artificial intelligence. In answer to the noble Lord, Lord Stevenson, this has not yet been set up on a statutory basis—as I think he well knows—but it will be. It is a question of legislative time, but it is our intention and plan to do that. In the meantime, as he knows, the Chancellor has made money available for it to act. It will work closely with regulators, including the ICO, to ensure that the law, regulation and guidance keep pace with developments in data-driven and AI-based technologies. The issue of the forward-looking aspects of the digital authority will partly be addressed by the Centre for Data Ethics and Innovation, but I will come back to the digital authority in a minute.
As set out in the online harms White Paper, creating a safe user environment online requires online services and products to be designed and built with user safety as a priority. We will work with industry and civil society to develop a safety by design framework.
The noble Lord, Lord Stevenson, and other noble Lords talked about market concentration, and the report recommends how the Government should approach mergers and acquisitions in this unique online environment. The Government’s Modernising Consumer Markets Green Paper sought views on how well equipped the UK’s competition regime is to manage emerging challenges, including the growth of fast-moving digital markets. We continue to consider the options across the range of measures proposed in the Green Paper, including for digital markets, and are due to report in summer 2019. This will be informed by the work of the independent Digital Competition Expert Panel, led by Professor Jason Furman, which published its recommendations for Government on 13 March. The Prime Minister announced yesterday that Jason Furman has agreed to advise on the next steps on how we can implement his recommendation to create a digital market unit. We are considering his other recommendations, and will respond later this year.
On the digital authority, which was one of the key recommendations of the report, to, among other things, co-ordinate regulators in the digital world, we support the committee’s view that effective regulation of digital technology requires a co-ordinated and coherent approach across the various sector regulators and bodies tasked with overseeing digital businesses. They need clarity and stability, and the Government should lead the way in providing oversight and co-ordination of digital regulation, and ensuring consistency and coherence. We are carefully considering how existing and new regulatory functions, such as that proposed through the online harms White Paper, will fit together to create an effective and coherent landscape that protects citizens and consumers. However, we are also conscious of the calls for speed, which have been made by many noble Lords and stakeholders, not all tonight. On the one hand, we have to carefully consider the implications of new regulation, as the noble Lord, Lord Gordon, told us; on the other hand, there are serious harms that need addressing now.
When I say we are carefully considering it, we are carefully considering it. The noble Lord, Lord Stevenson, is looking as if he is not taking me seriously, but we are.
I apologise to the Minister. It was just that he said that he was considering it, and that he is considering it. It did not seem to advance the argument very much.
I was considering it, we are considering it, and we will consider it further. Our worry is about speed: setting up a completely new regulator and co-ordinating the existing regulators is what we have to weigh up. The consultation is still going on, and that is something we can address.
The other main issue that several noble Lords have mentioned is about the 10 principles in the report, and the six principles in the charter, which I mentioned before. We have a set of principles that underpin the digital charter, and the online harms White Paper is part of the charter’s programme of work. The committee’s principles of regulation correspond with the White Paper approach. For example, on parity, what is unacceptable offline should be unacceptable online. However, the online harms White Paper does set out our intention to consult widely as we develop our proposals, so we will further consider the proposals as part of this, ahead of finalising new legislation.
The noble Lords, Lord McNally and Lord Stevenson, also mentioned pre-legislative scrutiny. We would like to consult thoroughly—we have had a Green Paper and a White Paper, both of which have had consultations that, we hope, will ensure that we get our proposals right. However, as I said before, there is a need for urgent action—that is increasingly evident—and we will take those factors into account when reaching a decision on whether to engage in pre-legislative scrutiny. We are not against it in principle—in fact, there are many ways in which it would be useful—but, having had two consultations already, we may decide in the long run that speed is more important and that we need to get things done.
As to the momentum to which the noble Lord, Lord Stevenson, referred, a Bill is definitely planned. It needs to be drafted after the consultation—which ends on 1 July—but it will not be easy legislation to frame if we are to capture all the areas that noble Lords have talked about. We have momentum and are keen to do it, as is the Home Office, which wishes to address particular issues such as child exploitation.
The noble Lord, Lord Stevenson, the right reverend Prelate and the noble Baronesses, Lady Harding and Lady Kidron, talked about age-appropriate design. The right reverend Prelate was concerned that we would row back from this. Age-appropriate design, or the kids’ code—or, as I call it, the Kidron code—is a part of the wider approach to tackling online harms and will play a key role in delivering robust protections for children online. We discussed it at length on the Bill. The ICO has consulted formally on the code and will continue to engage with industry. We are aware that the industry has raised concerns—the noble Baroness, Lady Kidron, mentioned some of them—but it is not beyond the wit of such an innovative industry to deal with those technical concerns. It is important that the ICO continues to work with the industry to make sure that the measures are workable and deliver the robust protection that children deserve. The ICO has a reputation as a proportionate regulator and we will stand behind it.
The noble Lord, Lord Gilbert, asked about a classification framework akin to that of the British Board of Film Classification. We have said in the online harms White Paper that companies will be required to take robust action, particularly where there is evidence that children are accessing inappropriate content, and that we expect the codes of practice issued by the regulators to make it clear that companies must ensure that their terms of service state what behaviour and what activity is tolerated on the service, as well as the measures that are in place to prevent children accessing inappropriate content. The regulator will assess how effectively these terms are enforced. The classification framework is an interesting idea. We are consulting on developing our proposals and we will certainly include that.
The noble Lord, Lord Gilbert, also asked for important assurances that the press are outside the scope of the duty of care and how the Government intended to balance journalistic freedom with the regulation of online harms. The Secretary of State has been clear that this is not intended to include journalistic content. We do not interfere with what the press does or does not publish as long as it abides by the law of the land. A free press is an essential part of our democracy, so journalistic or editorial content will not be affected by the regulatory framework we are putting in place.
The noble Viscount, Lord Colville, and the noble Lord, Lord Stevenson, mentioned gaming addiction. I have written to the noble Viscount, who reminded me that a whole six weeks had passed and he wondered what we had done about it. I do not think he has been in government or he would know that that is asking a bit much, especially as the consultation is still going on and does not finish until 1 July. We do not want to duplicate what is regulated by other gambling and gaming regulators. We are clearly looking at that important issue, but it is not within the scope of this White Paper.
The noble Viscount mentioned the GDPR loophole. I will have to look at that. I always thought that data subjects had the ability to ask for decisions made by algorithms to be explained, whether or not a person was involved in the decision. I will have to check the legal position and get back to him on that.
As far as the e-commerce directive and liability are concerned, the new regulatory framework will increase the responsibility of online services, but a focus on liability for the presence of illegal content does not incentivise the systematic, proactive responses we are looking to achieve. We think the way we are doing it—with the duty of care—gives companies the responsibility to be more proactive, and that the monitoring they have to do is within the scope of the e-commerce directive.
I once again thank the noble Lord and his committee for their report. I think we are aligned on some of the fundamental issues. The contributions this evening have shown that there is a depth of interest in this subject. If we get this right, we have an opportunity to lead the way and work with others globally. We will protect citizens, increase public trust in new technologies and create the best possible basis on which the digital economy and society can thrive.
I thank all noble Lords for their contributions to an excellent debate. I thank the noble Lords, Lord McNally and Lord Stevenson, for engaging in detail with the recommendations in our report, as well as the Minister, who answered all our questions at this late hour. He now has the unenviable task of grappling with the detail and bringing forward positive proposals to deal with these complex issues. He and his colleagues have engaged enthusiastically with the committee; I really thank them for that.
I agree with the noble Lord, Lord Maxton, who rightly highlighted all that is good with the internet and the danger of overregulation. That is why I think that the digital authority, with its forward-looking function of identifying risks before they emerge, will enable us to reach for not only regulatory solutions but, for example, public education campaigns to deal with those issues.
As we conducted this inquiry, I was struck by the amount of evidence we received, not just from industry and regulators but from great civil society organisations, academics, journalists and individual citizens who took time to write to us and send submissions, which the committee read with huge interest. In this day and age, when public service is not always recognised, I thank them. We heard from some frankly heroic people who are using technology and the internet to improve the lives of others and to do good.
Finally, we heard some disturbing evidence from some of our witnesses about child sexual exploitation and other ugly aspects of our society, from organisations such as the Internet Watch Foundation, the National Police Chiefs’ Council and the National Crime Agency. They work in some very dark areas of society and must endure much personal anguish, but they displayed great humanity when they came and spoke to us. They do amazing work. In them, we saw the best of our society.