Tim Loughton
I do not profess any specific expertise, but if I have any, it is in relation to the work done on hate crime on the internet. I congratulate the Minister on his work with us. I also congratulate his predecessors, my right hon. Friend the Member for Barking (Margaret Hodge), and Barbara Follett, who is no longer a Member of the House, on their initiatives. All have been effective, and are appreciated.
I initiated a working group in the Inter-parliamentary Coalition for Combating Antisemitism two years ago. We have managed to get senior executives for content from most of the world’s biggest internet companies to sit on the group, including executives from Apple, Google, Facebook, PayPal, Microsoft and Twitter. We also have one of the key interlocutors in the US on free speech, Professor Jeffrey Rosen, and, from the Ministry of Justice, the seconded Association of Chief Police Officers lead on hate crime, Paul Giannasi.
A report has been produced—it has not yet been circulated, but it will be within the next week, in this country and throughout the world—that the Minister and the Government will find useful. The report is about hate crime, but the issues are the same as those in the online protection of children: the grey areas that need to be tightened, the technical solutions and approaches, and the mindset in the industry.
Part of the problem the group has identified is the shadow internet. It is fine setting up solutions, but if that happens in separate countries, people will break them if they want to—they have relatively easy ways to do so. The debate so far has concentrated on websites and search engines, but, in fact, even when it comes to child abuse, gaming is as big a problem and a vastly growing one. Texting, smartphones and social networking are equally significant, growing and changing problems—the modality is changing.
The group makes six recommendations in the report on hate crime—they are relevant to the debate. The first recommendation is to create clear policies and include them within the terms of the service of the internet company. That would be a significant change. The working group has the key players and the decision makers—they are not the sub-decision makers, but the actual decision makers. That recommendation is achievable, and it would be significant.
The second recommendation is for mechanisms to enforce those policies. How do intermediaries, including national Governments, enforce them? For international industries, the role of intermediaries, whether they are specialist groups or national Governments, is a second key principle in the approach that should be taken.
The third and vital recommendation, which resonates with this debate, is to establish clear, user-friendly processes to allow users to report abuse. Those processes are not currently there, but they are achievable. If such mechanisms are in place, progress ought to be relatively straightforward where there is criminality—far more straightforward in relation to child abuse than to hate speech, where issues of illegality are far more complex. Clearly, there are technical solutions—I will not go so far as to suggest the software that the CIA has recently, allegedly, used—if the processes are in place.
The fourth recommendation is to increase transparency about terms-of-service enforcement decisions, for example through case studies. If an individual is prosecuted because someone has reported something that their child has stumbled across, the Government and other third parties have a critical role in how that is reported and made public.
The fifth recommendation, which is probably specific to hate speech, is to encourage counter-speech. It is the same concept as the splash concept.
The sixth recommendation is to unite the industry. The industry will not always be American—with its concepts of free speech—so it is critical to achieve agreement within the industry while it still is.
If I can bring the hon. Gentleman back to the third recommendation, he makes a good point about reporting and taking down material. The IWF does a good job in that area. Apparently, last year 1.5 million adults came across abusive content on the internet, but only 40,000 reports were made to the IWF, which has the powers to do something about it. There needs to be much greater publicity on how to report to ensure that action can take place.
Publicity on how to report, and technical ease in doing so, are key, so that the democratic internet world can hit back effectively and the industry can be monitored. The key members of the working group who really know what they are talking about would be more than happy to meet the Minister, if he would find that useful. We could bring them over from the US.
To get access to the right people, I went to meet industry leaders in their headquarters in California, and I made the point that their brands were in danger. If users and third parties, including national Governments, can show successes in prosecutions, the industry will throw far more resources at the issue. The industry does throw a lot of resources at it. A third of all Facebook employees are dealing with it, because the dangers to its brand are so fundamental, but at the moment it is less of an issue for other companies. They do see the dangers to their brands, however, which is why senior people from PayPal now turn up to meetings.
I intervened on the Minister—it was not a hostile intervention—on agreements in other countries. One danger is that different countries will do different things. Of course, that is not an excuse for any Government to hold back, but the French Government are taking various legal actions against some of the key internet giants, as are the Italians, and there is a danger that the approach will become too bitty. May I suggest to the Minister that he try to up the stakes and achieve European Union consensus on the back of Britain's lead? If Britain is ahead of the rest of the European Union, that is a good opportunity to set the standards that others can push up to and take forward. That would be pragmatic and significant. We attack the industry—I am happy to attack the industry in various ways—but it does not want terrorists using its platforms to kill people, and it does not want paedophiles using its products to abuse children. That is obvious to me, and it is also obvious to the industry.
I do not know what the certificate was, but may I just get on with my speech?
My point is that we trust the BBFC's classifications. When the Video Recordings Act 1984 was passed, more than 25 years ago, certain video works—I will come to online content in a second—were made exempt from classification because they were considered unlikely to be harmful. However, the content of exempt works has changed beyond recognition since 1984, which means that inappropriate and potentially harmful content can be legally supplied to children. On 24 May 2013, the Government announced that they planned to lower the exemptions threshold in order to prevent children from accessing potentially harmful material, so well done to the Government. This is a most welcome decision, for which the BBFC—along with the home entertainment industry, the recorded music industry, retailers and law enforcement bodies—had argued for some time.
Once implemented, the decision will improve the protection that children enjoy from potentially harmful media content by ensuring that video content featuring drug misuse, strong violence, racist language and certain sexual content can no longer legally be freely supplied to children. Instead, the BBFC will classify such content to keep it away from vulnerable and impressionable children. The Government have said that they hope to have the new regime in place by April 2014, and I very much hope—I know that the Minister is listening carefully—that the Government will keep to that timetable, which requires secondary legislation. However, the legislation has never covered online content, and there is now particular concern about the content of online music videos.
My hon. Friend is making a good point about the Government’s welcome announcement. There is still a problem though, because although there is some classification of adult content and 18 video ratings in gaming now, Auntie Mabel who buys a video for her grandchild at Christmas needs to be made absolutely aware of the severity of some of the content to which she might inadvertently be exposing her grandchildren. We need better information in the shops and on the part of retailers at the point of sale, so that she can ask whether she really wants her grandchildren to see that sort of content.
My hon. Friend makes a powerful point. I am sure that those on the Front Bench have taken it on board, and no doubt the Minister will deal with it explicitly in winding up.
The issue of online music videos, to which the Bailey report also referred, must be seriously considered. My attention was recently drawn to an online video made by a well known pop singer—I had not heard of her before, but never mind—which showed explicit shots of a young teenage girl, concerned about her body image, slitting her wrists in the bath. It is the video to a well known song—I remember hearing it in my house. Although it has a happy ending, I would argue that the graphic scenes in that video—which I am sure parents would allow their children to watch in a very relaxed way—are far too explicit and dangerous for young teenage children to watch. We all know that many of the children who follow these pop stars are very young and impressionable. At the very least, online videos should contain some kind of classification.
The Government are rightly pressing the music industry voluntarily to adopt age-appropriate ratings for online music videos. In response to a parliamentary question from the hon. Member for Bishop Auckland (Helen Goodman), the Under-Secretary of State for Education, my hon. Friend the Member for Crewe and Nantwich (Mr Timpson) said:
“The Government will now take action to: make sure that online music videos carry labels that show their age suitability, in order to protect children from harmful material; and make it even easier for parents to keep their children safe online, wherever they are and in whatever way they might access the internet.”—[Official Report, 6 June 2013; Vol. 563, c. 1263W.]
The onus has therefore been placed on the music industry to come forward with a system that will work.
The BBFC hopes to work with the recorded music industry towards the goal of achieving well understood and trusted age ratings and content advice for online music videos, as it has done successfully with the home entertainment industry in relation to other online videos. The BBFC has now rated more than 200,000 videos for online distribution by such companies as Walt Disney, 20th Century Fox, Paramount, Universal and Sony. BBFC ratings are used by platforms such as iTunes, Netflix, blinkbox, BT Vision and TalkTalk—some of which I had heard of.
No, I have not used any of them.
One obvious solution that the music industry could consider in response to the Government’s demands for age-appropriate ratings for online music videos would be to adopt BBFC classifications voluntarily online. Does the Minister agree that that would be a constructive way forward?
My final point relates to user-generated content—UGC. Independent research from June 2011 shows that while the public believe that the internet brings greater choice, freedom and flexibility, the majority of viewers still consider it important to be able to check the suitability of the audio-visual content that they download, with 85% of the public considering it important to have consistent BBFC classifications available for video-on-demand content. The figure rises to 90% for parents of children under 16.
However, it is amateur user-generated content such as that seen on YouTube that makes up the majority of video content online. This might feature content that is potentially harmful to children—I accessed the video to which I referred earlier through YouTube this morning—and it is presently unregulated. The BBFC and the Dutch regulator NICAM have together developed a tool for ordinary people to age-rate UGC across different countries and platforms. I hope that my technological friend to my right, my hon. Friend the Member for Vale of Glamorgan, will consider that a good thing.
The tool is designed to enable those with responsibility for children to make fully informed viewing choices about non-professional content online. Through a single, simple, free-to-complete questionnaire, the tool instantaneously produces an age rating that can be shown on screen. The ratings can differ from country to country to reflect different national sensitivities and concerns over content. The tool is simple. It contains six questions about the content of the UGC, on behaviour, drugs, horror, language, sex and violence. Completing the questionnaire takes less than a couple of minutes. It also includes a facility for viewers to report content that, in their view, might be illegal. In the UK, such a report would go direct to the Internet Watch Foundation, about which much has been said this afternoon.
The tool is also flexible. For instance, the questionnaire may be completed by those uploading content. Alternatively, it may be completed by those viewing the content online. The ratings can be linked to online filters. This new initiative will shortly be trialled by Mediaset in Italy, and the BBFC and NICAM are looking for trial partners elsewhere, including in the United Kingdom. This is an example of the kind of initiative that can make the online world safer for children, and it has been welcomed by the EU Commission’s Safer Internet Coalition. I very much hope that our Government will get behind this initiative to help parents and children to make better informed choices about user-generated content. As we have heard this afternoon, there is no silver bullet on this issue, but with such incremental advances, our children will be better protected.
The hon. Lady is right, which is exactly why we need simpler filters. The work done by TalkTalk and others provides precisely that. There should be simple, clear filters with simple, clear questions, so that parents can have a look and make a simple, clear decision. I do not want to force parents to abdicate that responsibility, because there are other consequences of these filters.
Any filtering system will have large errors. Some things that we might want filtered out will not be, because no filter can be made perfect. There is no way of identifying automatically what counts as pornography and what does not, or what is appropriate and what is inappropriate. That is simply impossible to achieve, so material will get through that we are not expecting to get through. There is also the problem of filtering out useful things. There are already many cases—when it comes to advice on lesbian, bisexual and transgender issues, for example—where mobile phone providers automatically filter out the content, which can cause serious harm to young people trying to get advice. Trying to get advice about abortion services is another problem. There is a whole range of such issues that are automatically filtered out by many mobile phone providers. If we are telling children that we do not want to let them have appropriate information, that can be damaging.
I should declare an interest as a champion of the Internet Watch Foundation. I am slightly disappointed at the rather defeatist attitude taken by the hon. Gentleman. The solution is not a silver bullet. It is not any one of the individual things that have been mentioned; it is a jigsaw. Empowering and giving resilience and confidence to our children—and confidence, resilience and expertise to their parents to be able to filter what they believe to be right and wrong—is an important part of that jigsaw. Filters have their flaws, but they are part of that jigsaw as well. Will the hon. Gentleman admit that some of the things mentioned during the debate are part of the solution and that we should not dismiss them simply because they are not absolutely perfect?
I agreed with almost everything the hon. Gentleman said, until the end. Yes, I think we should empower parents to make the correct decisions, and I believe we should educate children so that they can think for themselves and be empowered. I absolutely agree with all of that, but that is not what the motion says, and it is not what the hon. Member for Bishop Auckland emphasised. The hon. Gentleman and I would agree that there are some important measures for empowerment; the problem is that if we provide an illusion of protection, giving people a false sense of security, we can make them less safe. It can leave children more exposed than doing things that actually work. It also downgrades the role of parents and parenting.
Moreover, we must accept that any filter can be bypassed. It is easy for those who know what they are doing to carry out a quick Google search and find out how to bypass any filter that they encounter, and there is no way in which we could prevent that from happening. We must therefore try to engage with people rather than introducing state control in the form of legislation to force search engines to run in a particular way, because that does not work. [Interruption.] The motion calls for legislation. If the hon. Member for Bishop Auckland does not believe that it should, that is her problem. Perhaps it suggests that motions should be tabled rather earlier than a few hours before the deadline for any changes.
Yes, we must do something, but what we do must work, must be proportionate, and must make things better for the people about whom we are concerned. That, rather than what was suggested by the hon. Member for Bishop Auckland, is the way forward. I commend the Minister—it is good to see him back in the Chamber—for his work on the issue, for his commitment to trying to deal with the problems in a way that will make a difference, and for the position that he has taken today.
It is a pleasure to follow my hon. Friend the Member for Clwyd South (Susan Elan Jones). I declare myself a dinosaur where online issues are concerned. I was going to say the same thing about my hon. Friend the Member for Hackney North and Stoke Newington (Ms Abbott), but she is much more modern than I am. Although she, I and you, Madam Deputy Speaker, were elected 26 years ago yesterday, she is thoroughly modern in her approach. She was able to name the Pokémons as one of the groups that children look at online, though Pokémons are perfectly fine as creatures and they probably need protection from the children.
In the short time available to us to speak, let me say that I normally go to the hon. Member for Cambridge (Dr Huppert), who is a member of the Home Affairs Committee, for advice on these matters, and I listened carefully to what he said about filters. However, I think the real responsibility is on the internet companies and the service providers. They have got away with murder—literally, in some cases—because people have been able to use the internet to groom young girls and children and to behave in an irresponsible way. The internet companies throw up their hands and say that is freedom of speech.
We recently had some of those companies before the Home Affairs Committee during our last inquiry, and also during a previous inquiry, so we have questioned them about both the roots of radicalism and e-crime. We will invite them again when we look at this matter again. They are very reluctant to intervene, and a tiny proportion of their profits—a tiny proportion—goes to the Internet Watch Foundation. It is not enough. They cannot sit back complacently and allow these things to go on without intervening and cleaning up the internet.
The Home Secretary has made positive statements, after what happened in Woolwich, about her desire to get things done. I am glad that there is a summit next week. I hope that she will be invited and that this is not just being seen as an issue for the Department for Culture, Media and Sport, because when dealing with crime it is important to ensure that the police are fully involved.
The right hon. Gentleman makes a good point about the search engines, most of which are based in America, pleading freedom of speech. Does he agree that every search engine could have a simple sign on its home page alerting users to how they can report material they are concerned about, which would cost nothing? That way, there would be no excuse for not knowing what to do. They could also put money into having moderators to ensure a rapid response to unacceptable material.
Yes, and I pay tribute to the hon. Gentleman for all the work he did in that area as Children’s Minister and since then. The internet companies must be proactive. They have to go in and clean up the internet. They cannot just sit back and allow others to do it for them. It is so difficult to get internet companies to appear before Select Committees. It takes an age to find them, and then they always respond by saying that they are based in California or New York and therefore do not come over to the UK. They send us their public relations officers, but they, very nice people though they are, are not the decision makers.
I am full of praise for the work CEOP does. I have visited it, along with members of the Home Affairs Committee, and encourage other right hon. and hon. Members to go—it is just across Vauxhall Bridge Road—and see the fantastic work being done. I pay tribute to Jim Gamble for his work in setting it up in the first place and to Peter Davies, who leads it ably. I say to the Policing Minister—he is now in conversation with the Under-Secretary of State for Culture, Media and Sport, the hon. Member for Wantage (Mr Vaizey), who has done a great deal of work in these matters, for which we are grateful—that it is very important that we protect CEOP's budget. The Home Affairs Committee expressed concern that CEOP was being put into the National Crime Agency. We accept what the Government have done and understand the need to rationalise the policing landscape, but it is important to maintain CEOP's budget and focus. I understand that its budget will be cut by 10% over the next four years. Perhaps the Minister can reassure me that that is not the case and that CEOP, even though it is in the NCA—the Committee thinks that is fine for the moment, but we will revisit the subject—will still retain its focus. Ultimately, it provides terrific expertise that could benefit police forces across the country.
Finally, I recently visited Europol and Interpol. I urge the Policing Minister to visit those organisations, because I gather that no Home Office Minister has visited Europol in recent years. They are doing some fantastic work internationally. I know that the Cabinet Office has funded a project in Interpol specifically dealing with online child exploitation. I think that we can take credit for the work we are doing internationally. To return to my first point, the internet is a marvellous invention and a power for good, but as we have seen, and as we have heard today, it can be used in a different, darker way to exploit children. I hope that internet service providers and others involved in this whole area will understand their responsibilities and act accordingly.