(6 years, 6 months ago)
Lords Chamber
My Lords, I congratulate the right reverend Prelate the Bishop of St Albans on securing this debate, and the Church of England on the publication of its guidelines. For me, as a person of no faith, their inclusion of people like me—and indeed of people of other faiths—is also welcome.
I note that there are nine codes in total and five principles. While we were on the artificial intelligence Select Committee, the right reverend Prelate the Bishop of Oxford spoke of having 10 laws for AI. This resulted in a very amusing Guardian version of the 10 commandments for robots, the 10th being:
“Thou shalt remember that we can always unplug you if you get too uppity”.
Unfortunately, these words were not quite so effective when I tested them out on a teenager on an Xbox who was about to win a game of Fortnite Battle Royale. In the end, the AI Select Committee managed to get it down to five principles, which we published as part of our report in April 2018, subtitled Ready, Willing and Able? Understanding and establishing principles and codes for future generations, in a world where the tech giants overshadow Governments in size, scale and reach, is not only important; it is vital.
In spite of the car crash that is Brexit, I believe the UK remains in a strong position to lead on ethics, and these guidelines are a useful contribution to that debate. In the same way that the UK led the way on the ethical debates around in vitro fertilisation, the new online harms White Paper is a significant step forward in an area where the UK can lead. The statement by the Osaka G20 trade and economy Ministers, with the annexe on AI principles drawn from the OECD, shows that international agreement on the ethical issues in this area is possible.
The Minister will be aware that we on these Benches support the White Paper and have said so in our submission, with qualifications and comments. We agree that social media companies should have a new statutory duty of care to their users, above all to children and young people. As ever, I salute the work of the noble Baroness, Lady Kidron, who has tirelessly campaigned for the right to childhood. As she described, leaving it to the big tech firms to deliver on a voluntary basis is not working and is no longer an option. We support the Government’s adherence to the principle of regulating on a basis of risk and believe that Parliament and government have a clear role in defining the duty of care, with the chosen regulator settling the final form of the code.
We on these Benches believe that that regulator should be Ofcom. In our view, Ofcom has the necessary experience of producing codes and walking that tightrope between freedom of expression and duty of care. It also has the experience of working with other regulators. We believe that Ofcom should be set the task early of working on those draft codes, as children have waited long enough for this protection to be a reality for them. But we also recognise that the complexity of this issue would be best served by pre-legislative scrutiny and believe the Communications Act 2003 to be an excellent model. I am not saying that simply because my noble friend Lord McNally played such a significant role in that process. We also believe that an earlier Bill setting up the Centre for Data Ethics and Innovation as a regulatory advisory body in this field is important.
The right reverend Prelate posed an excellent question to the Government in his Motion: what steps are they taking,
“to promote positive social media behaviour”?
In a world where a President of the United States takes to Twitter to slate our Prime Minister, this feels like a surreal question to ask right now but is definitely one that should be asked. I look forward to hearing the Minister’s answers on it. I have only one other question for him to answer today, so I really hope that he will be able to respond to it in his summation. I am sure he agrees that schools need to educate children about how to use, and question, social media with the kindness and respect that the Church of England suggests. To achieve that, digital literacy, advice and support for children and parents is essential, as the noble Baroness, Lady Chisholm, described. It is good that there is now some evidence from Ofcom that children are learning to think more critically about the websites they visit, and that they recall being taught how to use the internet safely.
However, what of the generation that has been abandoned to the Wild West of the internet? What additional support can be given while we deliberate here about the best forms of legislation? I will be more specific. A year ago, to comply with GDPR, social media sites such as Facebook, WhatsApp, Pinterest, Instagram and others raised their age restrictions from 13 to 16. What happens to that cohort of children who are left behind, many of whom—according to Ofcom—were underage on social media already? Do parents have to inform on them and their mates, or close down the networks where they talk with each other about homework that is due? What advice does the Minister have for those parents? I ask only that he agree to look at this very specific issue—the children left behind by the ban as it was introduced—and if he would undertake to write to me, I would be very grateful.
Social media should and can be a place of truth, kindness, welcome, inspiration and togetherness. The Church of England’s principles for the use of social media should be commended for their optimistic goals. As the right reverend Prelate the Bishop of Chelmsford put it, we should be allowed to raise our expectations. These are goals which all of us should try to adopt.
(6 years, 9 months ago)
My Lords, it is excellent to follow the noble Lord, Lord Brooke, because I have worked with him on areas of addiction. I know of his campaigning in this area, and I admire and follow with interest his constant insistence on connecting it to health. I also thank the Minister for providing us with this debate. As the noble Lord, Lord Griffiths, rightly described it, it has been a good opportunity to have a fascinating conversation.
Every noble Lord has said that the White Paper is very welcome. To date, the internet, and social media in particular, have opened up huge freedoms for individuals. But this has come with too high a societal price tag, particularly for children and the vulnerable, as described by the noble Baroness, Lady Hollins. There is too much illegal content and activity on social media, including abuse, hate crimes and fraud, which has mostly gone unpoliced. As my noble friends Lord McNally, Lady Benjamin and Lord Storey said, we on these Benches therefore support the placing of a statutory duty of care on social media companies, with independent regulation to enforce its delivery. My noble friend Lady Benjamin was quite right to say that she was seated at this table a long time before many of us. The independent regulator could be something like the Office for Internet Safety, or, as described by the Communications Committee, the digital authority.
The evidence has been clear from the DCMS Select Committee, the Lords Select Committee on Communications, Doteveryone, 5Rights and the Carnegie Trust: they have all identified the regulatory gap that currently exists. A statutory duty of care would protect the safety of the user and, at the same time, respect the right to free speech, allowing for a flexible but secure environment for users. We agree that the new arrangements should apply to any sites that, in the words of the White Paper,
“allow users to share or discover user-generated content or interact with each other online”.
The distinction between regulated or self-regulated providers of information and platforms of unfiltered content is not something that your average teenage user of “Insta”, as they call Instagram, can draw—by the way, it is never Twitter they use; that is for “old people”. These Insta-teens do not distinguish between a regulated, substantiated information provider and inaccurate and harmful content or links. The noble Lord, Lord Puttnam, talked about digital literacy, which is absolutely essential. One of the greatest gifts we can give a new generation of children is the ability to question the content that is coming to them. Proper enforcement of existing laws, as mentioned by the noble Lord, Lord Anderson, is vital to protect users from harm. But the useful addition is that social media companies should have a statutory duty.
My noble friend Lord Clement-Jones so ably chaired the Select Committee report on artificial intelligence, Ready, Willing and Able?, a report that some of us talked about only last week. A year later, it is still keeping us all very busy with speaking engagements, and my noble friend is therefore very sorry that he cannot be here. He is currently in Dubai at the AI Everything conference to talk about AI and ethics. When the White Paper was published, he rightly said:
“It is good that the Government recognise the dangers that exist online and the inadequacy of current protections. However, regulation and enforcement must be based on clear evidence of well-defined harm, and must respect the rights to privacy and free expression of those who use social media legally and responsibly”.—[Official Report, 8/4/19; col. 431.]
He welcomed the Government’s stated commitment to these two aspects. The essential balance required was described by my noble friend Lord McNally, the noble Lord, Lord Kirkhope, and the noble Viscount, Lord Colville.
Parliament and Government have an essential role to play in defining that duty clearly. We cannot any longer leave it to the big tech firms such as Facebook and Twitter, as we heard from the noble Lord, Lord Haskel. We have been waiting and waiting for it to be done on a voluntary basis, and it is simply not good enough. It was therefore good to hear the Statement earlier today, on yesterday’s emergency summit, about self-harm and the commitment from some of the big tech firms to provide the Samaritans with support. However, the right reverend Prelate, my noble friend Lord Storey and the noble Baroness, Lady Thornton, were right about the need to follow the money and look at the level of investment versus the level of profit. I hope that the Minister will respond on that.
I want to explore in particular the use of the regulators that currently exist. Our finding on these Benches, following a series of meetings with the main regulators and after hearing evidence, is that they are keen to get started on some of these areas. While I appreciate that we are still in a period of consultation, I would like to explore this issue, because the need to deliver soon for the whole of the current generation is significant.
Does the Minister agree that it may well be possible to extend Ofcom’s regulatory powers so that it oversees the newly created duty of care? Does he agree that Ofcom could, in principle, be given the powers and appropriate resources to become the regulator that oversees a code for harmful social media content, and the platforms which curate that content, to prevent online harms under the duty? As the noble Viscount, Lord Colville, asked, what are the possibilities for the use of current regulators? Ofcom’s paper on this very issue, published last September, was very helpful in this respect. We heard from my noble friend Lord McNally about the success of the Communications Act 2003, and the scepticism beforehand about its ability to deliver. That scepticism runs in complete parallel to the current debate about whether regulation can apply to the internet—so it has been done before.
Likewise, how does the Minister view new powers for the Information Commissioner and the Electoral Commission, particularly in respect of the use of algorithms, explainability, transparency and micro-targeting? I apologise that I cannot provide more detail—I cannot seem to get on the internet here today, which is ironic—but there was a recent fascinating TED talk about voter suppression: not just influencing how people vote but trying to suppress voter turnout altogether, which I find horrific. What are the possibilities for the ICO and the Electoral Commission to expand and take up some of these duties?
The White Paper refers to the need for global co-operation, and my noble friend Lord Clement-Jones is pursuing the possibility of the G20 in Osaka being used as a key platform for an ethical approach to AI. Is it possible that the White Paper agenda could be included in this? In particular, it is about using the recommendations on ethical principles from the AI Select Committee, and the five principles for socially good AI from the European Commission High-Level Expert Group. What are the possibilities around that, given that we are trying to push for global agreement on responsible use of the internet?
The noble Lord, Lord Knight, mentioned transparency. There must be transparency around the reasons for decisions and any enforcement action, whether by social media companies or regulators. Users must have the ability to challenge a platform’s decision to ban them or remove their content. I very much welcome the fact that technology has been highlighted as something that is part of the solution.
For children, “not online” is simply not an option. Children at secondary school now have to use it to do their homework. It is no good me saying to my 13 year-old, “Get off your screen”, because he just might be on “Bitesize” or doing his maths. I have to get my kids’ meals paid for on this, so online is very much part of a child’s life. Screen-based activity could mean that they are doing their homework—fingers crossed.
However, I completely agree with what was said about the resistance of the gaming sector, in particular, to engage with this issue, and I support the noble Viscount, Lord Colville, on this. But my noble friend Lord McNally rightly pointed out our limitations generationally. Fair warning: I think that sitting through a popular vlogger on YouTube with 2 million subscribers describing the “Endgame” version of Fortnite to us would not enlighten us as legislators. It is therefore about us getting it right.
The noble Lord, Lord Anderson, said that we have been lucky. What I fear and worry about most of all is that today’s generation is not going to get the value of this White Paper, and that is particularly unlucky. Therefore, to get the balance right, as my noble friend Lord McNally rightly said, we should be considered in our approach. But I worry about the need to deliver quickly, and that is why I am asking whether there are regulators who can possibly trial some of this legislation in advance, rather than it going through pre-legislative scrutiny—because we know that some of them are ready to deliver on some of this.
I assume that when seat belts were originally discussed in a place like this, there were still some people who wanted children to bob about on the back seat in the name of freedom, without laws to protect them. Today that would be horrific and unheard of. We will probably look back and feel the same about children online. My noble friend Lord Storey is absolutely right: a child can find their way around the age issue. Give me a handful of 11 and 12 year-olds and I will show you the social media apps they are currently on. All of them are restricted to 13—but they are all on them. That is why the age verification issue must be tied in with younger people in particular, as was mentioned by the noble Baroness, Lady Howe, and my noble friend Lady Benjamin. In order to double or triple check, I ask the Minister whether it is being delivered in July. I thank him for his thumbs up; it is very well taken.
As I have said, we have a moral duty to move at quite a pace now. The White Paper is extremely welcome and I look forward to supporting its rapid progress through this House.
(6 years, 10 months ago)
Obviously, there are details that need to be ironed out, and that is partly what the consultation is about. I expect there to be a lot of detail, which we will go over when a Bill finally comes to this House. In the past we have dealt with things like the Data Protection Act and have shown that we can do that well. The list in the White Paper of legal harms and everyday harms, as the noble Baroness calls them, is indicative. I completely agree with her that the White Paper is attempting to drive good behaviour. The difference it will make is that companies cannot now say, “It’s not my problem”. If we incorporate this safety by design, they will have to do that, because they will have a duty of care right from the word go. They cannot say, “It’s not my responsibility”, because we have given them the responsibility, and if they do not exercise it there will be serious consequences.
My Lords, does the Minister plan to watch the last ever episode of the hugely successful comedy “Fleabag”, by Phoebe Waller-Bridge, tonight? Does he agree that it is perfectly possible to have brilliant and base dramas like “Fleabag” while protecting our children and the most vulnerable, and that Ofcom and other regulators have delivered that objective, balancing freedom of speech and protection from harm with considerable success since 2003? Does he agree that if we can invest in and enhance existing regulators to deliver protections from online harm as soon as possible, that is exactly what we should do, rather than asking our children to patiently wait for protections tomorrow that they really deserve today?
I agree with the noble Baroness that the television regulator and other media regulators have done a good job and that they are a good example. However, I will not be watching that programme, because I have an enormous amount of work today. If she promises not to ask any questions about the statutory instrument tomorrow, I might have a bit more time. But seriously, that shows that the decisions we are asking regulators to make are not easy. We are not trying to censor the internet. We want a vibrant internet which allows discussion, debate and different points of view but which does not allow some of the worst harms, which are indescribably bad. We need to deal with those, and we want to make the areas which are regulated offline also regulated online, in a reasonable and proportionate way.
(6 years, 10 months ago)
It is not completely fair to say that nothing has happened. In areas where personal data is used, for example, that has to be used lawfully under the aegis of the Data Protection Act. The Information Commissioner recently said that she was minded to issue guidelines on the use of data in respect of children. The Information Commissioner is a powerful regulator who is looking at the use of personal data. We also have the Digital Economy Act, and we have set up the Data Ethics Framework, which allows public bodies to use the data which informs algorithms in a way that is principled and transparent. Work is going on, but I take the noble Lord’s point that it has to be looked at fairly urgently.
My Lords, when the Chancellor asks the Competition and Markets Authority to scrutinise the transparency of Google and Facebook, are the Government confident that they are applying the same rules of transparency to public services in the UK? Is not waiting for an interim report a little bit too late, when the HART system used by Durham Police to predict reoffending, for example, is already well under way? Does the Minister accept that failure to properly scrutinise these kinds of algorithms risks the racial bias revealed by the investigation into the Northpointe system in Florida?
I understand that there are issues about facial recognition systems, which are often basically inaccurate. The essential point is that biometric data is classified as a special category of data under the Data Protection Act and the police and anyone else who uses it has to do so within the law.
(6 years, 11 months ago)
I completely agree, and that is why, as I said in an Answer on tourism last week, the tourism sector deal concentrates on skills, recruitment and avoiding a high turnover in jobs. It is trying to make those jobs more long-term to provide the service that visitors rightly expect. The third-quarter figures were down, particularly for short-haul visitors, but they have rebounded. The Office for National Statistics reported a 4% increase in October.
My Lords, given last week’s finding of the employment tribunal regarding the National Gallery 27, which supported their legal claim to worker status—having been denied it for decades—does the Minister regret that precious resource from a DCMS body was spent in legal action to justify shoddy work practices? Will he ensure that their claim is settled soon and that the National Gallery is held to account for it? What advice are the Government now giving to other bodies using taxpayers’ money to apply the worst practices of the gig economy?
(7 years, 2 months ago)
My Lords, I thank the noble Lord, Lord Stevenson of Balmacara, for initiating this debate on such an important subject. It is timely because, while so much seems to be at the stage of initiation, very little has reached a conclusion, so it is good to take stock. It is good that he has led us through a complex debate with his usual clarity. As ever, it has also been a real treat to hear in more detail about the work that the noble Baroness, Lady Kidron, has been doing in this area. She has already achieved so much in her work on the age-appropriate design code, with full support from these Benches and in particular from my noble friend Lord Clement-Jones. As we have heard, she is not satisfied with that and is pushing on to bigger and better achievements.
As a mum of a Generation Z 13 year-old, I am grateful for everything that the noble Baroness and the noble Lord, Lord Stevenson, are doing in this area. I guess the danger is that we will have sorted this only by the time we get to—what I believe we are now calling—Generation Alpha. It is possible we will look back on this time with horror and wonder what we did, as legislators who failed to move with the times, to a generation of children. While real joy comes from the internet, for a child the dangers are only too real.
The ICO call for evidence regarding the age-appropriate design code is very welcome, and I look forward to hearing a commitment that the noble Baroness, Lady Kidron, will be included every step of the way. An obligation will be placed on providers of online services and apps used by children. I just add that one of the difficulties here is dealing with children playing games such as “Assassin’s Creed”—which many under-18s play but which is rated 18 due to bad language and serious gore—in the same way that for years children have watched movies with a slightly older age restriction.
Bar one other child, mine was the last of his contemporaries aged 11 to move from a brick to a smartphone. The head teacher of his secondary school asked all parents to check their children’s social media every night. It will come as no surprise to the expert and knowledgeable speakers here tonight that literally no one checks, so groups of children without the knowledge of how to edit themselves are not unusual on platforms from which they are all banned but to which they still manage to sign up. 5Rights correctly identifies that they will struggle to delete their past and need the ability to do just that.
As we know, kids are both tech wizards and extremely naive. You set screen times and safety measures and then discover they have created a new person. You have to release security to download stuff, but you then realise they have accepted the kind of friends who call themselves David Beckham or whatever. At my last safeguarding training as a school governor, I was taught that children above 11 are now getting more savvy about online dangers, but it is the 8, 9 and 10 year-olds—or, as I prefer to call them, the Minecraft generation—who have an open door to literally everyone.
It is the school-age child we should continue to ask ourselves questions about when we look at whether the legislation is working. As every school leader or governor knows, safeguarding is taken so seriously that we are trained again and again to check on safeguarding issues the whole time. However, the minute a smartphone is delivered into a child’s hand—or to the sibling of a friend, which is much more of a problem—the potential to cut across the best safeguarding rules is gone and the potential for harm begins. When the NSPCC tells us that children can be groomed through the use of sexting within 45 minutes, we have to act.
I would like us to cast our minds back to 2003—which, in internet years, I guess would be our equivalent of medieval times—when the Communications Act placed a duty on Ofcom to set standards for the content of programmes, including,
“that generally accepted standards are applied to the content of television and radio services so as to provide adequate protection for members of the public from the inclusion in such services of offensive and harmful material”.
That requirement stemmed from a consensus at the time that broadcasting, by virtue of its universality in virtually every home in the country—and therefore its influence on people’s lives—should abide by certain societal standards. Exactly the same could be said now about social media, which is even more ubiquitous and, arguably, more influential, especially for young people.
However, it was striking to read the evidence given recently to the Communications Select Committee by the larger players—which, I must point out, is still in draft form. When those large social media companies were asked to ensure a similar approach, they seemed to be seeking greater clarity and definition of what constitutes harm and to whom this would happen, rather than saying, “Where do I sign?”
When the Minister responds, perhaps he could explain what has changed since 2003. If in 2003 there was general acceptance relating to the content of programmes for television and radio, protecting the public from offensive and harmful material, why have those definitions changed, or what makes them undeliverable now? Why did we understand what we meant by “harm” in 2003 but appear to ask what it is today?
The digital charter was welcomed in January 2018 and has been a valuable addition to this debate. We hope for great progress in the White Paper, which I understand will be produced in early 2019. However, I am sure that others know better than me and perhaps the Minister will tell us. When he does, will he give us a sneak peek at what progress the Government are making in looking at online platforms—for instance, on legal liability and sharing of content? It would be good to know whether the scales are now moving towards greater accountability. I understand that Ofcom was a witness at the Commons DCMS Select Committee last week. It said that discussions had been open and positive and we would like to hear more.
I recently had the privilege of being on the Artificial Intelligence Select Committee. Our report Ready, Willing and Able? made clear that there is a need for much greater transparency in this area. Algorithms and deep neural networks that cannot be made accountable should not be used on humans until full transparency is available. As the report concludes:
“We believe it is not acceptable to deploy any artificial intelligence system which could have a substantial impact on an individual’s life, unless it can generate a full and satisfactory explanation for the decisions it will take”.
I look forward to the debate on that report next week.
As with the AI Select Committee investigation, it is clear in this debate that there are many organisations in the field—from the ICO to Ofcom, from the Centre for Data Ethics to the ASA. The question becomes: is a single body required here, or do we, as a Parliament, increase resource and put greater responsibility into one existing organisation? If we do not decide, the danger of a lack of clarity and consistency becomes apparent.
I would welcome a comment from the Minister on the latest efforts in Germany in this area with its network enforcement law and its threatened fines of large sums if platforms do not rapidly take down hate speech and other illegal content. Does the Minister believe that it is possible to do that here? I was interested to hear that, as a result of such changes in German law, Facebook has had to increase its staff numbers in this safeguarding area—by a disproportionately large number in comparison with anywhere else in Europe.
The need for platforms and larger players to reform themselves regularly is starting to show. In the Lords Communications Select Committee session, Facebook was keen to point out its improvements to its algorithm for political advertising. Indeed, the large players will be quick to point out that they have developed codes and ethical principles. However, the AI Select Committee believes, as the Minister will have seen, that there is a need for a clear ethical code around AI with five principles. First, AI should be for the common good; secondly, it should be intelligible and fair; thirdly, it should not be used to diminish the data rights of individuals, families or communities; fourthly, everyone has the right to be educated to flourish alongside AI; and, fifthly, the power to hurt, destroy or deceive should never be vested in AI. Who could argue with that?
In a warm-up for next week’s debate, I wonder whether the Minister believes, as I do, that whether we are pre-Brexit, post-Brexit or over-a-cliff-without-a-parachute Brexit—which is currently looking more likely by the day—we in the UK still have the capacity to lead globally on an ethical framework in this area. In the committee we were also able to provide clarity on responsibility between the regulatory bodies. It was useful work.
One of the first pieces of legislation I successfully amended in this place, with colleagues on these Benches, was the Criminal Justice and Courts Act 2015. A friend of mine who had been a victim of revenge porn had found how inadequate the legislation was and that the police were unable to act. The debate around it was typical of so many of the debates in this area. A whole generation of legislators—us—born well before the advent of the smartphone was setting laws for a generation who literally photograph everything. The dilemma became how far ahead of what is already happening in society we need to be. It should be all the way, and revenge porn is now a criminal act with a maximum sentence of two years. Unfortunately, awareness of this law is still quite low, but I would like to guide us towards the deterrence factor in this discussion.
While I have concentrated most of my comments on the future generations, a word needs to be said for the parents. The Cambridge Analytica scandal and the investigation into the spending by Brexit campaigners in the referendum suggest that the general public as well as children need help to protect them from micro-targeting and bias in algorithms—all delivered through social media platforms. There is a danger that this will further break the trust—if there is any left—in the political processes. It is a reminder that while fines and investigations highlight such practices and behaviours, they are not the only steps to take to deal with them.
The forthcoming White Paper will look at institutional responsibilities and whether new regulatory powers should be called on by either existing regulators or others. Again, any clarity on the thought process and, of course, the timescale from the Minister will be welcome. While we wait for that White Paper, we can all reach the conclusion that the status quo does not work. Governments cannot wait until this regulation debate becomes outdated. If “harm” as a definition was good enough for TV and radio content in 2003, it is good enough for content on social media platforms today.
(8 years ago)
Lords Chamber
My Lords, I thank the noble Lord, Lord Cormack, for introducing this all-important debate. Museums and galleries in the UK are available for all levels of interest, knowledge and understanding; indeed, they provide many of us with deeply personal and lifelong memories. I will never forget my visit to the Jorvik centre in York in my early 20s and the sense of magic as I held in my hand something amazing—Viking poo. It was in a block of acrylic, of course, but that was a magical moment that put me in touch with history none the less. The Jorvik centre was one of the first places in the UK to start to make the experience of the past come alive in such a creative way. Nor will I forget seeing my mum, an evacuee in the Second World War, sharing with her grandson the experience of the brilliant exhibition on evacuees at the Imperial War Museum.
The wide variety of museums and galleries that we have today will help to ensure that we foster a future generation who appreciate art, culture and our shared history. Indeed, we all have a responsibility to ensure that museums and galleries work hard to increase inclusivity and shine a light on those who have traditionally been left out of our story—Mary Seacole is an example. The dementia programmes now run in many museums and galleries, which the noble Baroness referred to, are another example.
That 55% of the English public live within walking distance of at least one museum is a cause for much pride. That over half the adult population visit museums—up from around two in five a decade ago, according to the Mendoza Review—is encouraging. The Mendoza report also makes clear just how much value for money local museums provide for a very small share, as the noble Lord referenced, of the national expenditure. Museums in England generate £2.64 billion in income, including trading income, fundraising donations and grants in aid, and £1.54 billion in economic output, according to the Arts Council England report The Economic Impact of Museums in England in 2015. That is why, now more than ever, we need to ensure their future sustainability and stability.
As the noble Baroness, Lady Andrews, also referred to, there is an urgent need for central government to look at the funding issue. Indeed, if we are to believe that artificial intelligence will replace much of what we define as work today—and as a member of the Artificial Intelligence Select Committee, I have seen plenty of evidence so far to suggest that that is the case—it is all-important that we ensure that future generations have free access to creativity and culture that sets them apart from intelligent machines.
So it is worrying to learn that there is a decline in school visits, in part due to changes in the national curriculum. As a governor of an inner-city school, I have seen the value that is added when children visit places of cultural interest, in particular for children receiving the pupil premium allowance. That leaves me in no doubt of the value. That is why I believe the curriculum must ensure that children develop an understanding of the value of creativity.
In my view, the advent of artificial intelligence will need a highly creative and curious future generation. So we on these Benches recognise that to support the future success of the arts in Britain we must ensure that the right funding structures and regulatory environment are in place to encourage investment. But that investment must never compromise the independence of museums and galleries. In other words, public galleries and museums—those that are free—should not be expected to rely solely on private income. The danger of our past being explained by the highest bidder, or by the whims of the latest fashion, would then become too great.
Adapting to today’s funding environment is the most important challenge facing museums today. Over the past 10 years, as we have heard, overall funding has reduced by 13% in real terms, part of that as a result of the cuts to local government. Museums and galleries are, sadly, likely to take the hit in an austerity period, regardless of the value that they add locally and culturally.
That is why we in the Liberal Democrats in particular support the creation of creative enterprise zones: zones set up to grow and regenerate cultural output across the UK, to grow jobs in the sector, and to grow future generations armed, with a rich and vibrant knowledge of the past from their local and national museums, for whatever uncertainty lies ahead.
(8 years, 6 months ago)
Lords Chamber
We are going to liaise with civil society groups, as I have said, and academia. The Nuffield Foundation, for example, is going to develop plans in partnership with the Royal Society, the British Academy, the Royal Statistical Society and the Alan Turing Institute to establish an independent convention on data ethics. This is something we support and will contribute to, and I think the public will be able to learn from such conventions. As I say, we will update our thinking later in the year.
My Lords, will this commission cover not only ethics but the use and application of data, for example through machine learning and development of algorithms? Can the Minister also explain how this commission will interrelate with the new data protection regulations starting in 2018 and the digital charter announced in the Queen’s Speech?
The data protection Bill, which will come before Parliament in the autumn, is to give effect to the General Data Protection Regulation and the Law Enforcement Directive. It will obviously include things to do with privacy, but data ethics covers many other things, such as artificial intelligence, which the noble Baroness mentioned. So it is not specifically a regulatory thing, although regulation may come out of it. It is to consider the new issues that come with this new technology.