The hon. Lady is absolutely right, but writing algorithms to do that on millions and millions of websites simply cannot be done correctly. I shall come back to that, although I know that the hon. Lady and the right hon. and learned Member for Camberwell and Peckham are not concerned about the errors that would be made.
It is absolutely right to provide tools for parents to control what is happening. They should be the ones empowered to look after their children; I would rather trust parents to do that than require state-level controls. Those tools should be available for people to use, and they should be easy and clear to use. I think there should be no default, because we should encourage parents to engage with the question before they make a decision. They should be faced with a box that they have to tick, but they should be in charge. The Byron review was very clear that a false sense of security could be created if we just tell people that everything is safe.
The problem here is that we are not dealing with simply looking at a book or magazine and deciding whether it is suitable for a child. We are dealing with something that many people have said they find very difficult to operate; that has been the focus of much of the research. The outcome is that many parents are not able to use those filters.
The hon. Lady is right, which is exactly why we need simpler filters. The work done by TalkTalk and others provides precisely that. There should be simple, clear filters with simple, clear questions, so that parents can have a look and make a simple, clear decision. I do not want to force parents to abdicate that responsibility, because there are other consequences of these filters.
Any filtering system will make large numbers of errors. There will be errors of under-blocking, where material that we might want filtered out gets through, because the task cannot be done perfectly. There is no way of automatically identifying what counts as pornography and what does not, or what is appropriate and what is inappropriate. That is simply impossible to achieve, so material will get through that we are not expecting. There is also the problem of over-blocking useful material. There are already many cases, such as advice on lesbian, bisexual and transgender issues, where mobile phone providers automatically filter out the content, which can cause serious harm to young people trying to get advice. Trying to get advice about abortion services is another problem. A whole range of such material is automatically filtered out by many mobile phone providers. If we tell children that we will not let them have appropriate information, that can be damaging.
We must not approach this at cross purposes. I do not think that anyone is saying that any of the proposals are perfect; we are merely seeking to improve the situation and to give greater protection. I have no doubt that there will be some very clever people who can find ways around all sorts of things—we know that that happens—but to say that we should not put such measures in place for that reason would be wholly wrong.
Let me address some of what the hon. Member for Cambridge (Dr Huppert) said. He seemed, perhaps surprisingly, to be setting the state against the parent in a way that is not helpful. Of course parents should be making decisions for their children, but there are many circumstances in which we have to rely on others, in schools and in the wider world, to protect our children. That is not an abdication of parental responsibility, because parents cannot be with their child all the time; they will not be able to supervise every social contact their children have. As a parent, I would certainly prefer to be confident that I could let my children out into a world that I could regard as reasonably safe, whether physical or virtual, than to be unable to do so. Perhaps that is not what the hon. Gentleman was suggesting, but that was how it came across to me. To suggest that such an approach takes things out of parents' hands and asks the state to step in instead is a wholly wrong way of considering the matter.
The concern is that the filters will be easy to bypass and that a huge proportion of young people will be able to get past them. If parents are led to believe that such things mean their children will not be able to access inappropriate material when they are up in their room on their computer, that will lead them to make the wrong decisions about how best to look after their children. It is not that parents do not know how to do that; as we know from Ofcom's research asking parents about the subject, supervision is far more effective than a misleading sense of security.
One of the practical problems with that approach is the notion that someone is, in any sense, going to be there supervising their children through all this. We all do our best: we attempt to instil the values and behaviours that we want in our children, but it would be wrong to suggest that that will always work, even for those of us who think, or hope, that we are or have been very good parents. Children grow older and they are out in that wider world, in friends' homes or in all sorts of other social contexts. I want my children to be protected from being able to buy alcohol when they are too young to buy it. I do not want to have to accompany them everywhere to make sure they do not do that; I want to be sure that they are, within reason, protected. I know that that can be got round as well, because children are very good at getting fake IDs, but that does not mean that we should just abandon the attempt to control these things.