Online Harms Debate
Lord Griffiths of Burry Port (Labour - Life peer)
My Lords, it is with pleasure and a great deal of relief that I speak to this Statement and the White Paper that lies behind it. Having sat through endless hours of the previous debates and the acrimony generated by them, and having found ourselves in places where I suspect none of us wanted to be, it is a pleasure to come to proper business again and to look at something that affects the whole of our society. We must find remedies and seek a legislative way forward that deals with the problems that we know are part and parcel of this innovative and brilliant thing that we call the internet and the technological advances that go with it.
Having read the White Paper and listened to the Statement, I am convinced that, across the Benches of this House, we must see this as unique in a party-political system in that we must act together. Consensual approaches and sensible resolutions to the problem are a duty that falls upon all of us. After all, the internet affects every part of our society—all of us have felt the questions it raises and enjoyed the wonderful opportunities it affords—so I hope that we can approach this in a consensual and cross-party way.
I congratulate the Government—is it not wonderful to hear someone from these Benches saying that?—on producing a report that is lucid and clear and will generate the kind of discussion that the consultative period, now beginning, will need. It is well laid out; my son is a printer, and he constantly beleaguers me about layout as I understand it and layout as he understands it, and he would be pleased with this. I can give no higher commendation. Congratulations are in order.
I know that we will have detailed, forensic debates when the results of the consultation are before us. At the moment, highlighting some of the headline aspects will have to do. The duty of care has been spoken to already and we must emphasise it; after all, we are all aware of those who are harmed by the abuse of the internet. Some well-publicised cases leave their images constantly before our eyes, especially when we think that some of them, indeed a lot of them, are children. In previous legislation that we have debated on the Floor of the House, we have talked about designing the internet in such a way that the interests and rights of children are protected. I am quite sure that we will take all that forward in the outworking of the further proposals in this White Paper.
We want to protect people from harms, and we will no doubt want to discuss what we think constitutes harms in the proper sense. The White Paper, rather conveniently, tabulates the harms: those that are illegal, those that are dangerous and those that deserve attention. These are indicative lists, and no doubt we will want to move things from here to there and there to here, and add to and subtract from them as time goes on, but it is a pretty good starting point to show us the range of conduct and activities to which we will need to give attention.
It is a bold White Paper. It claims to be bold and boasts of being bold. For me, there is one aspect that teases me, and I hope the Minister can give us some reassurance on it. It is the whole idea that while the internet and online activity affect us locally—in our homes and elsewhere—this has to be balanced against the fact that the companies, across whose platforms the material that generates these problems comes, are global. We have seen how difficult it is to deal with the taxation aspects of these global companies. It will be equally difficult to think about legislation that could bring them all into line, and a word about that would be very helpful as we steer our way into the consideration of these proposals.
Statutory measures are mentioned, and I am delighted about that, of course, because these proposals and this way forward need to be underpinned by the full force of the law, and the regulator will be endowed with powers that are appropriate to the importance of the job. I wonder how we will bring a regulator to birth; some suggest that it should perhaps be an offshoot of Ofcom in the first place, that under the aegis of Ofcom we can get regulation built into our way forward, and that it can evolve into something more complete later.
Any legislation that we bring forward will need to be nimble and flexible, because technology moves faster than the making of laws, and since the making of a law, as we know from the one we have been discussing, can be interminable, I hope that we will never be accused of tardiness but will act promptly, flexibly and nimbly to combat the downside of online activities.
So I congratulate the Government, and I look forward to further debates in greater detail.
My Lords, we, too, on these Benches welcome the fact that the Government’s proposals have come forward today, and we support the placing of a statutory duty of care on social media companies. We agree that the new arrangements should apply to any sites,
“that allow users to share or discover user-generated content, or interact with each other online”.
We think that is a fair definition.
We are all aware of the benefits of social media networks and the positive role they can play. There is, however, far too much illegal content and harmful activity on social media that the platforms fail to deal with and that creates social harm. The self-harming material on Instagram and the footage of the Christchurch killings are perhaps the most recent examples.
Proper enforcement of existing laws is, of course, vital to protect users from harm, but, as the White Paper proposes, social media companies should have a statutory duty of care to their users—above all, to children and young people—and, as I say, we fully support the proposed duty of care. It follows that, through the proposed codes, Parliament and Government have an important role to play in defining that duty clearly. We cannot leave it to big private tech firms, such as Facebook and Twitter, to decide the acceptable bounds of conduct and free speech on a purely voluntary basis, as they have been doing to date.
It is good that the Government recognise the dangers that exist online and the inadequacy of current protections. However, regulation and enforcement must be based on clear evidence of well-defined harm, and must respect the rights to privacy and free expression of those who use social media legally and responsibly. I welcome the Government’s stated commitment to these two aspects.
We also very much welcome the Government’s adherence to the principle of regulating on a basis of risk and proportionality when enforcing the duty of care and drawing up the codes. Will the codes, when powers of oversight are exercised, set out clearly the distinction between criminal content, harmful content and antisocial content, as the Lords Communications Committee called for? By the same token, upholding the right to freedom of expression does not mean a laissez-faire approach. Does the Minister agree that bullying and abuse prevent people expressing themselves freely and must be stamped out? Will there be a requirement that users must be able to report harmful or illegal content to platforms and have their reports dealt with appropriately, including being kept informed of the progress and outcome of any complaint?
Similarly, there must be transparency about the reasons for decisions and any enforcement action, whether by social media companies or regulators. Users must have the ability to challenge a platform’s decision to ban them or remove their content. We welcome the proposed three-month consultation period; indeed, I welcome the Government’s intention to achieve cross-party consensus on the crucial issue of regulating online harms. I agree that with a national consensus we could indeed play an international leadership role in this area.
Then we come to the question of the appropriate regulator to enforce this code and duty. Many of us assumed that this would naturally fall to Ofcom, with its experience and expertise, particularly in upholding freedom of speech. If it is not to be Ofcom, with all its experience, what criteria will be used in determining what new or existing body will be designated? The same appears to me to apply to the question of whether the ICO is the right regulator for the algorithms used by social media. I see that the Home Office will be drawing up certain codes. Who will be responsible for the non-criminal codes? Have the Government considered the proposals by Doteveryone and the Lords Communications Select Committee for a new “Office for Internet Safety” as an advisory body to analyse online harms, identify gaps in regulation and enforcement and recommend new regulations and powers to Parliament?
At the end of the day, regulation alone cannot address all these harms. As the noble Baroness, Lady Kidron, has said, children have the right to a childhood. Schools need to educate children about how to use social media responsibly and be safe online, as advocated by the PSHE Association and strongly supported by my party. Parents must be empowered to protect their children through digital literacy, advice and support. I very much hope that that is what is proposed by the online media literacy strategy.
At the end of the day, we all need to recognise that this kind of regulation can only do so much. We need a change of culture among the social media companies. They should be proactively seeking to prevent harm. The Government refer to a culture of continuous improvement being a desired goal. We on these Benches thoroughly agree that that is vital.