Commons Chamber

I am coming to that very point. Challenges remain, but the last thing we want to do is create the impression that this is a simple issue and that children and families can be protected at the flick of a switch; it is much more complicated than that and deserves an intelligent debate. We need to recognise the differences between these areas, rather than giving the impression, as some Members have, that flicking a switch will make the difference. An ISP-level filter would be blind to many of the very risks from which we need to protect children: it would not guard against bullying, grooming or other serious dangers, yet at the same time it would give parents a false sense of security.
One of the most effective answers—there will be several answers, and filters have a part to play, but they are not the only solution—is for a parent to take a genuine interest in what is being viewed online. I am pleased that the debate over the past year or so has focused the minds of technology providers on making device-level and even profile-level security features and filters easier to use and understand. Google has its SafeSearch, for example, while Windows 8 has taken significant steps: it can e-mail parents a list of all the sites viewed in the household so that they can check for themselves what a child has been looking at. Furthermore, when someone now signs up to an ISP or sets up a new router, they are asked what settings they want, not only for the household but for each computer. It needs to go further still, however, down to profile level, because the same computer can be used by different people. It is important, therefore, that the right filter settings are in place on each profile to protect the children using the computer. Clearly, technology companies need to do more to communicate that message and to help parents further.
My comments so far have related to legal adult content, but we would all agree that the far more serious issues surround illegal content, particularly that involving the abuse of children—the area on which most of the recent public debate has focused. It is extremely important that we distinguish between legal and illegal content. This should not be a party political issue, and there are no easy solutions. Some content might be distasteful, but it might well be available on the shelves of newsagents or of shops in Soho.
I am running short of time, but if the hon. Lady will allow me to make my point, I might answer her question.
We need to recognise, however, that the policing of such shops is relatively straightforward and that, in general, children cannot access or stumble across such material. Appropriate filters should stop the “stumbling across” element, but that still leaves us with the policing. We need to publicise the work of the Internet Watch Foundation (IWF) and reassure people who might report issues to it that they will not necessarily be compromised. Much attention is focused on search engine companies, and it is important that they play their part—they have a responsibility here—and, having researched their activities, I am aware of some of the technology they use to identify illegal content. They can therefore rightly claim to be playing a part, but search engines need to be at the cutting edge of image analysis and coding—they need to stay one step ahead of the perpetrators of these terrible offences.
By focusing the debate on search engines, as some Members did earlier, we forget that hosting is where the offence effectively lies. Even if a website has been removed from a search engine’s index, the URL still exists, and those seeking to view illegal content can go straight to that address. The IWF, which has been mentioned several times—I welcome the extra money made available to it today—has made a huge difference. Only 1% of the content it removes from the internet is hosted in the UK; 54% is hosted in north America; 37% is hosted across the rest of Europe and Russia; the figure for Asia is only 1%; and for South America it is smaller still. Those are the issues. It is an international problem.