Part of the reason for defining the scope in a way that excludes, for example, fraud is that it is not typically user-generated content; it is also the result of the point that the right reverend Prelate makes about speed of implementation, which is obviously paramount. The Government have recently announced a new national data strategy, which I am happy to share with him if he has not already seen it.
I also congratulate the Government on bringing forward this White Paper. It is time that those who generate such depravity and abuse of children are challenged. It is an issue in which I have a particular interest because during the 2017 general election campaign, when I fought to retain my parliamentary seat, my family and I were subjected to a torrent of abuse online from anonymous contributors. Try as I might, I was unable to persuade the leading social media companies to take action, so I have a simple question. In the response to the White Paper, the Government talk of
“setting codes of practice, establishing a transparency, trust and accountability framework and requiring … companies to have effective and accessible mechanisms for users to report concerns”.
If this process is to be effectively policed, what additional resources will be provided to the regulator to enable an effective investigative and prosecuting regime to enforce not just against the social media companies but also against the perpetrators? What oversight will there be to ensure that companies are not marking their own homework?
We are absolutely committed to the era of “marking their own homework” being over. We will obviously make sure that Ofcom, in particular, is sufficiently resourced in terms of capacity for the incredibly important task that it faces. Where Ofcom needs specific expertise—for example, a skilled person’s report—we are committed to that being made available.