(3 years, 1 month ago)
Grand Committee

My Lords, this has been a fascinating debate. I particularly applaud our committee chair, the noble Lord, Lord Gilbert, and the staff he has already mentioned, for steering us through this inquiry—and, in the process, disappointing the noble Lord, Lord Grade, as he described earlier—which collided with one of the most significant periods to be a journalist, particularly in news: namely, the global pandemic.
Our aim at the start of this inquiry was to ensure that we examined innovative and sustainable platforms for the future of journalism. We were viewing that future in the context of the changes and challenges, particularly in technology, which have threatened traditional print media. If anything, the pandemic at the beginning accelerated some of those challenges, but it also highlighted a demand—a basic democratic right—for accurate, trusted news, particularly in the midst of the tragic global meltdown. The problem has been particularly acute at local level, where newspapers have closed and whole communities no longer have access to reliable local news and information.
The current existential threat, particularly in an unfair advertising market, and our hopes that the DMU has the necessary powers to tackle that issue and introduce a media bargaining code, were clearly and eloquently explained by the noble Lord, Lord Gilbert, the noble Viscount, Lord Colville, and the noble Baroness, Lady Buscombe. I look forward to hearing the Minister’s response to that specific query and I welcome him to his new role. I remind noble Lords, following the Australia experience, of the dangers in solutions that benefit only the larger publishers—a point I will develop later.
The difficult task in this report was to keep ensuring that we had set our sights on the innovations in the future and did not dwell too much on the grudges of the past—however tempting that may be, and as we have possibly heard once or twice today. In a world where social media has been the source of so much untrustworthy news, it was particularly important that people knew where to go and who to trust. They needed to know how to judge which information sources were trustworthy. My noble friend Lord McNally and the noble Lord, Lord Lipsey, gave us some of the answers to this vexed question. Clearly, one of those answers is to resolve the legacy of Leveson 1 and the failure to deliver on Leveson 2—something we as a committee could not examine, given the lack of consensus.
Until some of those fundamental issues of how to measure trust are solved, one area of evidence that was particularly interesting to me was the increasing range of organisations that provide online credibility ratings through either extensions or plug-ins—almost like nutrition labels. Companies such as NewsGuard, the Trust Project and the Journalism Trust Initiative are now often used by larger companies to make significant advertising spend decisions, thereby forcing traditional publishers to be more transparent. It is encouraging that, through this market solution, media outlets have had to provide information such as more detail about their journalists, about their correction policies and, above all, about who owns them.
If only the Government could extend that level of transparency to their own Public Service Broadcasting Advisory Panel, referenced in their response to our report but still to this day shrouded in mystery: it never reports, and it does not even announce when it meets. In fact, there is very little information about how it was selected in the first place. I wonder whether the Minister could enlighten us in his response.
New initiatives to create greater transparency and therefore trust are welcome, as is anything that puts greater control in the hands of the consumer—which is why it is so important that so many noble Lords have referenced digital literacy. I am only sorry that, unlike the noble Baroness, Lady Buscombe, I found the Government’s response so uninspiring in comparison with the CLEMI initiative in France, which we recommended they strongly consider. The lack of co-ordination in this area remains disappointing, and I believe it is a missed opportunity.
Another missed opportunity from the Government’s response is their refusal to play a greater role in co-ordinating some of the excellent initiatives such as the Nesta Future News Pilot Fund, the Facebook Journalism Project and the Google News Initiative, as the noble Lord, Lord Vaizey, described. He gave us an insight into how it is possible to be a broker and bring together and co-ordinate initiatives such as those. I urge the Government to reconsider their reply on the issue.
Another initiative which the Government, in their response to our report, warmly welcomed was the BBC’s local democracy reporting scheme, as referred to by the noble Baroness, Lady Wheatcroft, which, as they say, is making a valuable and diverse contribution to the sustainability of the press sector. A recent colleague of mine, Kiro Evans, is shortly leaving the joys of political public relations to go back to his first love, reporting, thanks to this scheme. He, like so many other young reporters, will be starting out in journalism and connecting people and communities through this great initiative. He is young, he is talented, he is black, and I hope he goes all the way.
While our report applauds some of the blind testing initiatives in the sector to ensure strong diversity, when we scratched the surface and looked in the boardroom and among the columnists, as in so many other sectors, the diversity challenge still has simply not been met. That is why I also support a revisit of the YouGov request about diversity.
Given the criticism of the LDRS when internally reviewed by the BBC last year and the fact that the vast majority of the scheme’s annual £8 million allocation goes to the UK’s three biggest local news groups—Reach, Newsquest and National World—it did seem a little like sour grapes when the representative body of those groups, the News Media Association, in a quote referenced by the noble Lord, Lord Faulks, attacked the BBC for investing in 100 new digital community journalists while its members benefit directly from that licence money.
The noble Lord, Lord Inglewood, touched on an issue which I found particularly interesting and made me hopeful for the future: charitable status criteria. We heard evidence that the UK lacks the philanthropic journalism which is able to register under the Internal Revenue Service’s tax code in the US. The Public Interest News Foundation was able to give us some insight into changes in France, Germany and Canada on this front. While I appreciate that, as the noble Lord explained, these remain baby steps, they are still worth examining and the Charity Commission’s decision that PINF is established for charitable purposes in recognition of public interest journalism is one to watch.
As Professor Steve Barnett put it in his written evidence, there is a need to support the
“growing culture of entrepreneurial journalism using digital media outlets, which are clearly capable of fulfilling some of the key informational, watchdog and investigative functions that local communities require”.
Sadly indicative of the attitude of this Government to such entrepreneurialism in media was the use of the “All in, all together” advertising funds at the start of the pandemic—a laudable initiative to prop up an industry in trouble. What a shame, then, that—as the then Minister John Whittingdale explained to us—the vast majority of those funds were allocated to members of the NMA, which represents the large publishers. A healthy media economy must surely include both large and small publishers; national groups and local independents; legacy print titles and digital natives. By favouring one part of the industry over another, the Government will inevitably foster suspicion and mistrust.
If we are to foster innovation and growth in this sector, the Government need to have an open mind about the small independents. Indeed, in both devolved nations, the Cairncross recommendations to invest in local news are currently under consideration. The blanket “No” from this Government is a wasted opportunity and, to me, seems frankly short-sighted.
I will spend as much time on the subject of Twitter as most people do going about their normal lives—which, as the evidence in our report showed, is very little. Even this sentence has already been disproportionate and way too long. I expect that most people do not spend any time on Twitter at all. We seem to obsess about it far too much in this place, while most of the population are, very sensibly, elsewhere and not paying any attention.
Finally, I worry about a level playing field regarding the BBC. I want to go back to the comments I made at the start. Trust has never been so important. Which institution is by far the most trusted for news and information, not just about the pandemic but about all information and knowledge as we go about our lives as active, democratic citizens? It remains the BBC. Undermining that precious gift and global showcase is an act of self-harm.
Like others, I welcome this report heartily and hope that it makes a significant contribution. I am sad to have recently left the committee. It was an absolute joy to serve on and I wish all colleagues on it much success in the future.
(3 years, 6 months ago)
Lords Chamber

My Lords, as a member of this committee, I record my thanks to the noble Lord, Lord Gilbert of Panteg, for steering us through this and more recent reports. I also associate myself with his thanks to our advisers and staff. I started in this report’s final stages, so I also record my thanks to my noble friends Lady Bonham-Carter and Lady Benjamin— who preceded me and my noble friend Lord Storey from these Benches—and to all other members of the committee.
Little did we know when this report was published in November 2019 that, a year later, we would be in the midst of a global pandemic, when trusted sources of information, regulated to provide the public with impartial and accurate news, would be such a vital lifeline for so many of us, alongside high-quality online education such as BBC Bitesize and, of course, as much entertainment as we could get our hands on. How wonderful it is that this was all available to every household in the country, free at the point of use for those unable or unwilling to pay subscriptions for Netflix, Amazon or Sky.
This debate comes in the wake of the BBC’s independent judge-led Dyson inquiry into events of 25 years ago. While Bashir’s behaviour was shocking and subsequent management action lamentable, there have been fundamental changes in BBC accountability since then. The BBC is now under external regulation by Ofcom, and any such lapses in editorial standards would be swiftly exposed. It should certainly not be our focus today.
I intend to confine my comments to some of the broader strategic questions which this report set out to address—in particular, how the unique ecology of public service broadcasting in the United Kingdom can survive and thrive in a future where subscription video on demand, or SVOD, appears to reign supreme.
Why does it remain so important in the context of so much available content? We need only take a short hop across the Atlantic to get our first answer. In the US, fake news and polarisation of opinion ultimately ended with a President promoting violence to suppress the results of the ballot box. It gave us a graphic demonstration of a dystopian, unregulated future without well-resourced and trusted PSBs committed to accuracy and impartiality.
In the UK, we have a unique blend of publicly and commercially funded public service broadcasters which enhance our economy, culture and democracy. Indeed, last year’s Ofcom research, The Impact of Lockdown on Audiences’ Relationship with PSB, found that most audiences had a greater sense of its value on behalf of society as a whole. It also highlighted its value for older and more vulnerable audiences. That research revealed that audiences put greater value on the need for news that reflects the regions and nations. I guess if you live in Bolton or one of the other seven areas right now with constantly changing government advice, accurate information about what exactly is going on is an essential public service. Therefore, our recommendation that Ofcom should ensure that public service broadcasters uphold the spirit of regional news and production quotas is even more critical today. The BBC’s “Across the UK” plan and the move of Channel 4 headquarters to Leeds are both welcome initiatives in that area.
Even before the pandemic, the evidence in our report that PSBs are vital to our democracy and culture and to the UK’s image on the world stage was overwhelming. Commercial rivals in the UK also see our PSBs as a critical part of the make-up of the creative sector; the Commercial Broadcasters Association described them as the bedrock of the UK audio-visual sector.
PSBs have invested £2.6 billion in the UK, delivering 32,000 hours of original home-grown content—125 times more than Netflix, which is still, even in a time of Covid, lockdowns and “The Queen’s Gambit”, not making a profit. PSB investment gives underpinning stability to our creative industries that the uncertain funding of streaming services cannot. As the report concludes:
“PSBs provide a stable investment platform for a diverse range of content, made for UK audiences, and freely available on a reliable over the air platform.”
PSBs also provide event television, bringing the nation together. Just look at the nearly 13 million who watched the epic finale of “Line of Duty” or the 4 million who watched Jenny tearing up her notes on “Gogglebox” while watching the same programme.
Last year, sadly, we lost a member of this committee: Lord Gordon of Strathblane, whose long-standing experience in media was a huge asset to this report. He particularly advocated event television and extending the listed events regime, especially relating to sport. I am sorry that the Government rejected that recommendation and would like to hear why.
Finally, but vitally, as the noble Lord, Lord Gilbert, mentioned, mandated prominence is critical for PSBs across all devices. PSBs must be easy to find in a fragmented media environment, whether as channels or hubs or through their own portal. The Ofcom proposal to update this is critical; I hope the Government will support this initiative and bring forward legislation this year.
This is a nation that needs to heal from division and disease. Public service broadcasters have a vital role to play in that process. We must give them the resources and support to get on with it.
(3 years, 11 months ago)
Lords Chamber

My noble friend makes important points. Of course, we are co-operating with all the different three-letter acronyms that he mentioned and maybe many more—who knows? In all seriousness, there is also a balance to be struck in the delivery of this important legislation.
My Lords, this is a welcome move, if achingly slow. I have just a couple of questions. First, in Annexe A, companies are expected to assess themselves on whether their service is likely to be accessed by children. What level of confidence does the Minister have that companies will honestly assess whether children are accessing their services? For instance, WhatsApp has changed its age limit twice since 2018. Is she confident that they will be honest about the number of children under the ages of 16 or 13 using their services? Does she accept that the decision to exempt online news organisations leaves open a back door to online harm? Under these proposals, the Daily Mail is still able to share the video of the Christchurch mosque attack, which Google and Facebook are not. Will she take a look at that issue?
I am aware that if my noble friend Lord McNally were asking a question right now, he would suggest that the pre-legislative scrutiny should be done by a Joint Committee. My plea on that—I declare an interest as a member of one of the relevant committees that will scrutinise this—is that speed is of the essence. Unless we are able to scrutinise swiftly, we leave many vulnerable to the internet. This has been too long in the making.
On the noble Baroness’s first point, I understand why she asks about it and we have given the matter careful consideration. Platforms will need to prove that children are not accessing their content by sharing any existing age verification or assurance information and by reviewing the data on their users. They will need to evidence that in a robust way to satisfy Ofcom. I shall take back the point regarding the Christchurch video. I know that my right honourable friend the Secretary of State talked about how he valued the expertise of both Houses, so I hope that is a warm note regarding scrutiny.
(4 years, 7 months ago)
Lords Chamber

I am happy to agree with the points raised by my noble friend. There is an irony that, at a moment when our thirst for quick news feels so urgent, the time we need to take to get accurate news is even more important. I commend those journalists who are playing such an important part in achieving that.
Does the Minister agree that, if anything, there should be an acceleration to pass laws to make social media companies more accountable, with a duty of care and the use of criminal sanctions? Unfortunately, last week the Secretary of State appeared to be slamming on the brakes, asking them instead to beef up their systems and, in his words,
“drive reliance on reliable narratives”.
Any delay to online harms laws lets social media companies off the hook at this critical moment. Will the Minister agree to an urgent meeting with Peers to provide detail on the progress of this legislation?
I will be delighted to meet noble Lords to discuss this. I stress that the Government have been absolutely clear that we want the social media companies, which have unparalleled engineering capacity, to be even more proactive in addressing this very urgent threat.
(5 years, 4 months ago)
Lords Chamber

My Lords, I congratulate the right reverend Prelate the Bishop of St Albans on securing this debate, and the Church of England on the publication of its guidelines. For me, as a person of no faith, their inclusion of people like me—and indeed of other faiths—is also welcome.
I note that there are nine codes in total and five principles. While we were on the artificial intelligence Select Committee, the right reverend Prelate the Bishop of Oxford spoke of having 10 laws for AI. This resulted in a very amusing Guardian version of the 10 commandments for robots, the 10th being:
“Thou shalt remember that we can always unplug you if you get too uppity”.
Unfortunately, these words were not quite so effective when I tested them out on a teenager on an Xbox, who was about to win a game of Fortnite Battle Royale. In the end, the AI Select Committee managed to get it down to five principles that we published as part of our report in April 2018, subtitled Ready, Willing and Able? Understanding and establishing principles and codes for future generations in a world where the tech giants overshadow Governments in size, scale and reach is not only important; it is vital.
In spite of the car crash that is Brexit, I believe the UK remains in a strong position to lead on ethics, and these guidelines are a useful contribution to that debate. In the same way that the UK led the way on the ethical debates around in vitro fertilisation, the new online harms White Paper is a significant step forward in an area where the UK can lead. The statement by the Osaka G20 trade and economy Ministers, with the annexe on AI principles drawn from the OECD, shows that international agreement on the ethical issues in this area is possible.
The Minister will be aware that we on these Benches support the White Paper and have said so in our submission, with qualifications and comments. We agree that social media companies should have a new statutory duty of care to their users, above all to children and young people. As ever, I salute the work of the noble Baroness, Lady Kidron, who has tirelessly campaigned for the right to childhood. As she described, leaving it to the big tech firms to deliver on a voluntary basis is not working and is no longer an option. We support the Government’s adherence to the principle of regulating on a basis of risk and believe that Parliament and government have a clear role in defining the duty of care, with the chosen regulator settling the final form of the code.
We on these Benches believe that that regulator should be Ofcom. In our view, Ofcom has the necessary experience of producing codes and walking that tightrope between freedom of expression and duty of care. It also has the experience of working with other regulators. We believe that Ofcom should be set the task early of working on those draft codes, as children have waited long enough for this protection to be a reality for them. But we also recognise that the complexity of this issue would be best served by pre-legislative scrutiny and believe the Communications Act 2003 to be an excellent model. I am not saying that simply because my noble friend Lord McNally played such a significant role in that process. We also believe that an earlier Bill setting up the Centre for Data Ethics and Innovation as a regulatory advisory body in this field is important.
The right reverend Prelate posed an excellent question to the Government in his Motion: what steps are they taking,
“to promote positive social media behaviour”?
In a world where a President of the United States takes to Twitter to slate our Prime Minister, this feels like a surreal question to ask right now but is definitely one that should be asked. I look forward to hearing the Minister’s answers on it. I have only one other question for him to answer today, so I really hope that he will be able to respond to it in his summation. I am sure he agrees that schools need to educate children about how to use, and question, social media with the kindness and respect that the Church of England suggests. To achieve that, digital literacy, advice and support for children and parents is essential, as the noble Baroness, Lady Chisholm, described. It is good that there is now some evidence from Ofcom that children are learning to think more critically about the websites they visit, and that they recall being taught how to use the internet safely.
However, what of the generation that has been abandoned to the Wild West of the internet? What additional support can be given while we deliberate here about the best forms of legislation? I will be more specific. A year ago, to comply with GDPR, social media sites such as Facebook, WhatsApp, Pinterest, Instagram and others raised their age restrictions from 13 to 16. What happens to that cohort of children who are left behind, many of whom—according to Ofcom—were underage on social media already? Do parents have to inform on them and their mates, or close down the networks where they talk with each other about homework that is due? What advice does the Minister have for those parents? I ask only that he agree to look at this very specific issue—the children left behind by the ban as it was introduced—and if he would undertake to write to me, I would be very grateful.
Social media should and can be a place of truth, kindness, welcome, inspiration and togetherness. The Church of England’s principles for the use of social media should be commended for their optimistic goals. As the right reverend Prelate the Bishop of Chelmsford put it, we should be allowed to raise our expectations. These are goals which all of us should try to adopt.
(5 years, 7 months ago)
Lords Chamber

My Lords, it is excellent to follow the noble Lord, Lord Brooke, because I have worked with him on areas of addiction. I know of his campaigning in this area, and I admire and follow with interest his constant insistence on connecting it to health. I also thank the Minister for providing us with this debate. As the noble Lord, Lord Griffiths, rightly described it, it has been a good opportunity to have a fascinating conversation.
Every noble Lord has said that the White Paper is very welcome. To date, the internet, and social media in particular, have opened up huge freedoms for individuals. But this has come with too high a societal price tag, particularly for children and the vulnerable, as described by the noble Baroness, Lady Hollins. There is too much illegal content and activity on social media, including abuse, hate crimes and fraud, which has mostly gone unpoliced. As my noble friends Lord McNally, Lady Benjamin and Lord Storey said, we on these Benches therefore support the placing of a statutory duty of care on social media companies, with independent regulation to enforce its delivery. My noble friend Lady Benjamin was quite right to say that she was seated at this table a long time before many of us. The independent regulator could be something like the Office for Internet Safety, or, as described by the Communications Committee, the digital authority.
The evidence has been clear from the DCMS Select Committee, the Lords Select Committee on Communications, Doteveryone, 5Rights and the Carnegie Trust: they have all identified the regulatory gap that currently exists. A statutory duty of care would protect the safety of the user and, at the same time, respect the right to free speech, allowing for a flexible but secure environment for users. We agree that the new arrangements should apply to any sites that, in the words of the White Paper,
“allow users to share or discover user-generated content or interact with each other online”.
The flow between regulated or self-regulated providers of information and providers of platforms of unfiltered content is not something that your average teenage user of “Insta”, as they call Instagram, can distinguish—by the way, it is never Twitter they use; that is for “old people”. These Insta-teens do not distinguish between a regulated, substantiated information provider and inaccurate and harmful content or links. The noble Lord, Lord Puttnam, talked about digital literacy, which is absolutely essential. One of the greatest gifts we can give a new generation of children is the ability to question the content that is coming to them. Proper enforcement of existing laws, as mentioned by the noble Lord, Lord Anderson, is vital to protect users from harm. But the useful addition is that social media companies should have a statutory duty.
My noble friend Lord Clement-Jones so ably chaired the Select Committee report on artificial intelligence, Ready, Willing and Able?; a report that some of us talked about only last week. A year later, it is still keeping us all very busy with speaking engagements, and therefore my noble friend is very sorry that he cannot be here. He is currently in Dubai at the AI Everything conference to talk about AI and ethics. When the White Paper was published, he rightly said:
“It is good that the Government recognise the dangers that exist online and the inadequacy of current protections. However, regulation and enforcement must be based on clear evidence of well-defined harm, and must respect the rights to privacy and free expression of those who use social media legally and responsibly”.—[Official Report, 8/4/19; col. 431.]
He welcomed the Government’s stated commitment to these two aspects. The essential balance required was described by my noble friend Lord McNally, the noble Lord, Lord Kirkhope, and the noble Viscount, Lord Colville.
Parliament and Government have an essential role to play in defining that duty clearly. We cannot any longer leave it to the big tech firms such as Facebook and Twitter, as we heard from the noble Lord, Lord Haskel. We have been waiting and waiting for it to be done on a voluntary basis, and it is simply not good enough. It was therefore good to hear the Statement earlier today, on yesterday’s emergency summit, about self-harm and the commitment from some of the big tech firms to provide the Samaritans with support. However, the right reverend Prelate, my noble friend Lord Storey and the noble Baroness, Lady Thornton, were right about the need to follow the money and look at the level of investment versus the level of profit. I hope that the Minister will respond on that.
I want to explore in particular the use of regulators that currently exist. Our findings on these Benches, following a series of meetings with the main regulators and after hearing evidence, are that they are keen to get started on some of these areas. While I appreciate that we are still in a period of consultation, I would like to explore this issue, because the need to deliver soon for the whole current generation is significant.
Does the Minister agree that it may well be possible to extend regulatory powers to Ofcom so that it oversees the newly created duty of care? Does he agree that Ofcom could, in principle, be given the powers and appropriate resources to become the regulator that oversees a code for harmful social media content, and the platforms which curate that content, to prevent online harms under the duty? As the noble Viscount, Lord Colville, asked, what are the possibilities for the use of current regulators? Ofcom’s paper on this very issue, published last September, was very helpful in this respect. We heard from my noble friend Lord McNally about the success of the Communications Act 2003, and the scepticism beforehand about its ability to deliver. It runs in complete parallel to what is currently being debated about how regulation can apply to the internet—so it has been done before.
Likewise, how does the Minister view new powers for the Information Commissioner and the Electoral Commission, particularly in respect of the use of algorithms, explainability, transparency and micro-targeting? I apologise that I cannot provide more detail—I cannot seem to get on the internet here today, which is ironic—but there was a recent fascinating TED talk about the suppression of voting. It was about not just the impact on voting but trying to suppress voter turnout, which I find horrific. What are the possibilities for the ICO and the Electoral Commission to expand and take up some of these duties?
The White Paper refers to the need for global co-operation, and my noble friend Lord Clement-Jones is pursuing the possibility of the G20 in Osaka being used as a key platform for an ethical approach to AI. Is it possible that the White Paper agenda could be included in this? In particular, it is about using the recommendations on ethical principles from the AI Select Committee, and the five principles for socially good AI from the European Commission High-Level Expert Group. What are the possibilities around that, given that we are trying to push for global agreement on responsible use of the internet?
The noble Lord, Lord Knight, mentioned transparency. There must be transparency around the reasons for decisions and any enforcement action, whether by social media companies or regulators. Users must have the ability to challenge a platform’s decision to ban them or remove their content. I very much welcome the fact that technology has been highlighted as something that is part of the solution.
For children, “not online” is simply not an option. Children at secondary school now have to use it to do their homework. It is no good me saying to my 13 year-old, “Get off your screen”, because he just might be on “Bitesize” or doing his maths. I have to get my kids’ meals paid for on this, so online is very much part of a child’s life. Screen-based activity could mean that they are doing their homework—fingers crossed.
However, I completely agree with what was said about the resistance of the gaming sector, in particular, to engage with this issue, and I support the noble Viscount, Lord Colville, on this. But my noble friend Lord McNally rightly pointed out our limitations generationally. Fair warning: I think that sitting through a popular vlogger on YouTube with 2 million subscribers describing the “Endgame” version of Fortnite to us would not enlighten us as legislators. It is therefore about us getting it right.
The noble Lord, Lord Anderson, said that we have been lucky. What I fear and worry about most of all is that today’s generation is not going to get the value of this White Paper, and that is particularly unlucky. Therefore, to get the balance right, as my noble friend Lord McNally rightly said, we should be considered in our approach. But I worry about the need to deliver quickly, and that is why I am asking whether there are regulators who can possibly trial some of this legislation in advance, rather than it going through pre-legislative scrutiny—because we know that some of them are ready to deliver on some of this.
I assume that when seat belts were originally discussed in a place like this, there were still some people who wanted children to bob about on the back seat in the name of freedom, without laws to protect them. Today that would be horrific and unheard of. We will probably look back and feel the same about children online. My noble friend Lord Storey is absolutely right: a child can find their way around the age issue. Give me a handful of 11 and 12 year-olds and I will show you the social media apps they are currently on. All of them are restricted to 13—but they are all on them. That is why the age verification issue must be tied in with younger people in particular, as was mentioned by the noble Baroness, Lady Howe, and my noble friend Lady Benjamin. In order to double or triple check, I ask the Minister whether it is being delivered in July. I thank him for his thumbs up; it is very well taken.
As I have said, we have a moral duty to move at quite a pace now. The White Paper is extremely welcome and I look forward to supporting its rapid progress through this House.
(5 years, 7 months ago)
Lords Chamber
Obviously, there are details that need to be ironed out, and that is partly what the consultation is about. I expect there to be a lot of detail, which we will go over when a Bill finally comes to this House. In the past we have dealt with things like the Data Protection Act and have shown that we can do that well. The list in the White Paper of legal harms and everyday harms, as the noble Baroness calls them, is indicative. I completely agree with her that the White Paper is attempting to drive good behaviour. The difference it will make is that companies cannot now say, “It’s not my problem”. If we incorporate this safety by design, they will have to do that, because they will have a duty of care right from the word go. They cannot say, “It’s not my responsibility”, because we have given them the responsibility, and if they do not exercise it there will be serious consequences.
My Lords, does the Minister plan to watch the last ever episode of the hugely successful comedy “Fleabag”, by Phoebe Waller-Bridge, tonight? Does he agree that it is perfectly possible to have brilliant and base dramas like “Fleabag” while protecting our children and the most vulnerable, and that Ofcom and other regulators have delivered that objective, balancing freedom of speech and protection from harm with considerable success since 2003? Does he agree that if we can invest in and enhance existing regulators to deliver protections from online harm as soon as possible, that is exactly what we should do, rather than asking our children to patiently wait for protections tomorrow that they really deserve today?
I agree with the noble Baroness that the television regulator and other media regulators have done a good job and that they are a good example. However, I will not be watching that programme, because I have an enormous amount of work today. If she promises not to ask any questions about the statutory instrument tomorrow, I might have a bit more time. But seriously, that shows that the decisions we are asking regulators to make are not easy. We are not trying to censor the internet. We want a vibrant internet which allows discussion, debate and different points of view but which does not allow some of the worst harms, which are indescribably bad. We need to deal with those, and we want to make the areas which are regulated offline also regulated online, in a reasonable and proportionate way.
(5 years, 8 months ago)
Lords Chamber
It is not completely fair to say that nothing has happened. In areas where personal data is used, for example, that has to be used lawfully under the aegis of the Data Protection Act. The Information Commissioner recently said that she was minded to issue guidelines on the use of data in respect of children. The Information Commissioner is a powerful regulator who is looking at the use of personal data. We also have the Digital Economy Act, and we have set up the Data Ethics Framework, which allows public bodies to use the data which informs algorithms in a way that is principled and transparent. Work is going on, but I take the noble Lord’s point that it has to be looked at fairly urgently.
My Lords, when the Chancellor asks the Competition and Markets Authority to scrutinise the transparency of Google and Facebook, are the Government confident that they are applying the same rules of transparency to public services in the UK? Is not waiting for an interim report a little bit too late, when the HART system used by Durham Police to predict reoffending, for example, is already well under way? Does the Minister accept that failure to properly scrutinise these kinds of algorithms risks the racial bias revealed by the investigation into the Northpointe system in Florida?
I understand that there are issues about facial recognition systems, which are often basically inaccurate. The essential point is that biometric data is classified as a special category of data under the Data Protection Act and the police and anyone else who uses it has to do so within the law.
(5 years, 9 months ago)
Lords Chamber
I completely agree, and that is why, as I said in an Answer on tourism last week, the tourism sector deal concentrates on skills, recruitment and avoiding a high turnover in jobs. It is trying to make those jobs more long-term to provide the service that visitors rightly expect. The third-quarter figures were down, particularly for short-haul visitors, but they have rebounded. The Office for National Statistics reported a 4% increase in October.
My Lords, given last week’s finding of the employment tribunal regarding the National Gallery 27, which supported their legal claim to worker status—having been denied it for decades—does the Minister regret that precious resource from a DCMS body was spent in legal action to justify shoddy work practices? Will he ensure that their claim is settled soon and that the National Gallery is held to account for it? What advice are the Government now giving to other bodies using taxpayers’ money to apply the worst practices of the gig economy?
(6 years ago)
Lords Chamber
My Lords, I thank the noble Lord, Lord Stevenson of Balmacara, for initiating this debate on such an important subject. It is timely because while so much seems to be at the stage of initiation, very little has reached a conclusion, so it is good to take stock. It is good that he has led us through a complex debate with his usual clarity. As ever, it has also been a real treat to hear in more detail about the work that the noble Baroness, Lady Kidron, has been doing in this area. She has already achieved so much in her work on the age-appropriate design code, with full support from these Benches and in particular from my noble friend Lord Clement-Jones. As we have heard, she is not satisfied with that and is pushing on to bigger and better achievements.
As a mum of a Generation Z 13 year-old, I am grateful for everything that the noble Baroness and the noble Lord, Lord Stevenson, are doing in this area. I guess the danger is that we will have sorted this only by the time we get to—what I believe we are now calling—Generation Alpha. It is possible we will look back on this time with horror and wonder what we did, as legislators who failed to move with the times, to a generation of children. While real joy comes from the internet, for a child the dangers are only too real.
The ICO call for evidence regarding the age-appropriate design code is very welcome, and I look forward to hearing the commitment that the noble Baroness, Lady Kidron, will be included every step of the way. An obligation will be placed on providers of online services and apps used by children. I just add that one of the difficulties here is dealing with children playing games such as “Assassin’s Creed”—which many under-18s play but is rated 18 due to bad language and serious gore—in the same way that for years children have watched movies with a slightly older age restriction.
Bar one other child, mine was the last of his contemporaries aged 11 to move from a brick to a smartphone. The head teacher of his secondary school asked all parents to check their children’s social media every night. It will come as no surprise to the expert and knowledgeable speakers here tonight that literally no one checks, so groups of children, without the knowledge of how to edit themselves, are not unusual on platforms from which they are all banned but to which they still manage to sign up. 5Rights correctly identifies that they will struggle to delete their past and need the ability to do just that.
As we know, kids are both tech wizards and extremely naive. You set screen times and safety measures and then discover they have created a new person. You have to release security to download stuff, but you then realise they have accepted the kind of friends who call themselves David Beckham or whatever. At my most recent safeguarding training as a school governor, I was taught that children above 11 are now getting more savvy about online dangers, but it is the 8, 9 and 10 year-olds—or, as I prefer to call them, the Minecraft generation—who have an open door to literally everyone.
It is the school-age child we should continue to ask ourselves questions about when we look at whether the legislation is working. As every school leader or governor knows, safeguarding is taken so seriously that we are trained again and again to check on safeguarding issues the whole time. However, the minute a smartphone is delivered into a child’s hand—or to the sibling of a friend, which is much more of a problem—the potential to cut across the best safeguarding rules is gone and the potential for harm begins. When the NSPCC tells us that children can be groomed through the use of sexting within 45 minutes, we have to act.
I would like us to cast our minds back to 2003—which, in internet years, I guess would be our equivalent of medieval times—when the Communications Act placed a duty on Ofcom to set standards for the content of programmes, including,
“that generally accepted standards are applied to the content of television and radio services so as to provide adequate protection for members of the public from the inclusion in such services of offensive and harmful material”.
That requirement stemmed from a consensus at the time that broadcasting, by virtue of its universality in virtually every home in the country—and therefore its influence on people’s lives—should abide by certain societal standards. Exactly the same could be said now about social media, which is even more ubiquitous and, arguably, more influential, especially for young people.
However, it was striking to read the evidence given recently to the Communications Select Committee by the larger players—which, I must point out, is still in draft form. When those large social media companies were asked to ensure a similar approach, they seemed to be seeking greater clarity and definition of what constitutes harm and to whom this would happen, rather than saying, “Where do I sign?”
When the Minister responds, perhaps he could explain what has changed since 2003. If in 2003 there was general acceptance relating to the content of programmes for television and radio, protecting the public from offensive and harmful material, why have those definitions changed, or what makes them undeliverable now? Why did we understand what we meant by “harm” in 2003 but appear to ask what it is today?
The digital charter was welcomed in January 2018 and has been a valuable addition to this debate. We hope for great progress in the White Paper, which I understand will be produced in early 2019. However, I am sure that others know better than me and perhaps the Minister will tell us. When he does, will he give us a sneak peek at what progress the Government are making in looking at online platforms—for instance, on legal liability and sharing of content? It would be good to know whether the scales are now moving towards greater accountability. I understand that Ofcom was a witness at the Commons DCMS Select Committee last week. It said that discussions had been open and positive and we would like to hear more.
I recently had the privilege of being on the Artificial Intelligence Select Committee. Our report Ready, Willing and Able? made clear that there is a need for much greater transparency in this area. Algorithms and deep neural networks that cannot be accountable should not be used on humans until full transparency is available. As the report concludes:
“We believe it is not acceptable to deploy any artificial intelligence system which could have a substantial impact on an individual’s life, unless it can generate a full and satisfactory explanation for the decisions it will take”.
I look forward to the debate on that report next week.
As with the AI Select Committee investigation, it is clear in this debate that there are many organisations in the field—from the ICO to Ofcom, from the Centre for Data Ethics to the ASA. The question becomes: is a single new body required here, or do we, as a Parliament, increase resource and put greater responsibility into one existing organisation? If we do not decide, the dangers of a lack of clarity and consistency will become apparent.
I would welcome a comment from the Minister on the latest efforts in Germany in this area with its network enforcement law and its threatened fines of large sums if platforms do not rapidly take down hate speech and other illegal content. Does the Minister believe that it is possible to do that here? I was interested to hear that, as a result of such changes in German law, Facebook has had to increase its staff numbers in this safeguarding area—by a disproportionately large number in comparison with anywhere else in Europe.
The need for platforms and larger players to reform themselves regularly is starting to show. In the Lords Communications Select Committee session, Facebook was keen to point out its improvements to its algorithm for political advertising. Indeed, the large players will be quick to point out that they have developed codes and ethical principles. However, the AI Select Committee believes, as the Minister will have seen, that there is a need for a clear ethical code around AI with five principles. First, AI should be for the common good; secondly, it should be intelligible and fair; thirdly, it should not be used to diminish the data rights of individuals, families or communities; fourthly, everyone has the right to be educated to flourish alongside AI; and, fifthly, the power to hurt, destroy or deceive should never be vested in AI. Who could argue with that?
As a warm-up for next week’s debate, I wonder whether the Minister believes, as I do, that whether we are pre-Brexit, post-Brexit or over-a-cliff-without-a-parachute Brexit—which is currently looking more likely by the day—we in the UK still have the capacity to lead globally on an ethical framework in this area. In the committee we were also able to provide clarity on responsibility between the regulatory bodies. It was useful work.
One of the first pieces of legislation I successfully amended in this place with colleagues on these Benches was the Criminal Justice and Courts Act 2015. A friend of mine who had been a victim of revenge porn had found how inadequate the legislation was and that the police were unable to act. The debate around it was typical of so many of the debates in this area. A whole generation of legislators—us—born well before the advent of the smartphone was setting laws for a generation who literally photograph everything. The dilemma became about how far ahead of what is already happening in society we need to be. It should be all the way, and it is now a criminal act with a maximum sentence of two years. Unfortunately, awareness of this law is still quite low, but I would like to guide us towards the deterrence factor in this discussion.
While I have concentrated most of my comments on the future generations, a word needs to be said for the parents. The Cambridge Analytica scandal and the investigation into the spending by Brexit campaigners in the referendum suggest that the general public as well as children need help to protect them from micro-targeting and bias in algorithms—all delivered through social media platforms. There is a danger that this will further break the trust—if there is any left—in the political processes. It is a reminder that while fines and investigations highlight such practices and behaviours, they are not the only steps to take to deal with them.
The forthcoming White Paper will look at institutional responsibilities and whether new regulatory powers should be conferred on existing regulators or on others. Again, any clarity on the thought process and, of course, the timescale from the Minister will be welcome. While we wait for that White Paper, we can all reach the conclusion that the status quo does not work. Governments cannot wait until this regulation debate becomes outdated. If “harm” as a definition was good enough for TV and radio content in 2003, it is good enough for content on social media platforms today.