Wednesday 31st October 2012


Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

This information is provided by Parallel Parliament and does not form part of the official record

[Albert Owen in the Chair]
14:30
Fiona Mactaggart (Slough) (Lab)

It is a pleasure to appear in this Chamber in front of you, Mr Owen. I feel as though I have spent most of the day here. I am pleased to have been able to secure this debate, for which I have been pressing for some weeks.

Politicians and companies alike have failed to address the new challenges that the internet brings. I am not at this point arguing that the state needs to do more right now, although it might need to in future. Companies that use the internet need to have robust policies to protect vulnerable users. They need to take responsibility for the impact of what they do from the start of their operations.

Children and young people are a substantial and persistent body of internet users. A report published in September 2012 by McAfee suggested that 82% of five-year-olds

“have access to a computer, smartphone, tablet or other way of getting online.”

Nine out of 10 of those aged between 12 and 15 live in homes with internet access. In schools, use of the internet is now more or less universal. Increasingly, it is being integrated into lesson plans to make use of richer content, and it is often a regular part of how schools communicate with parents.

The internet is used at home to enable children to do their homework. It is a major linchpin or communications hub in huge numbers of children’s social lives. Indeed, not having access to the internet can mark someone out as odd, or as coming from a disadvantaged family.

With the rise of smartphones and other internet-enabled portable devices such as games consoles, and the emergence of large-scale public wi-fi, internet access is also pretty ubiquitous, or soon will be in all our major cities. Thus the notion that parents could in any meaningful sense provide constant support or supervision of their children’s use of the internet is becoming impossible to sustain. I make these points in part to underline a core element of my argument about industry’s responsibility, which I will come to later.

First, I will say a word about the industry. In fact, there is no such thing as the internet industry. At one point there was: back in the 1980s and early ’90s. Computers and networking had been well-established for years, so the then new internet industry essentially consisted solely of internet service providers and geeks who wrote software. It was all very neat and tidy, and easy to identify and deal with.

Today almost every business of any size has some sort of stake in the internet. All of them have a responsibility of some sort to people who go online, especially to children. Many of them make great efforts to discharge that responsibility with great care and attention, but I am afraid that it is also quite plain that many do not. It is the many that we need to focus on.

The internet is not a sort of social service, or an extension of the classroom with knobs on, like social networking sites. Just as money is said to make the world go round, it most certainly makes the internet go round, and children are right in the middle of it. In 2006, children and young people in the UK up to the age of 19 spent £12 billion from their pocket money or from earnings derived from part-time jobs. Of that, £1.53 billion went on clothes, and £1 billion on food and drink; music and computer-related products took another £1 billion. In the same year, when account is taken of the amounts spent by parents on their children or in the home—spending over which children and young people often have influence—the total value of the market increased to almost £100 billion.

One of the largest of the virtual worlds aimed expressly at young children is Club Penguin. When Disney acquired the business in 2007, it was reported to have paid $700 million. According to the Financial Times, in June 2011, the UK-based children’s site, Moshi Monsters, was reported to be valued at £125 million. Children and young people are therefore major economic actors, both in their own right and through the influence that they exert on patterns of consumption within the wider family.

The size of the market helps to explain why so many different companies are interested in children and young people. It is not just about cash tomorrow; it is very much about cash today. Moreover, the sums indicated suggest that this market matters not only to the individual firms that may be competing for parts of it, but also to the national economy.

Children’s and young people’s online spending is also growing. A report published in December 2010 suggested that British kids between the ages of 7 and 16 spent £448 million, with eight out of 10 using their parents’ cards, online accounts, or PayPal. Apparently, £64 million was spent without parents’ knowledge.

The emergence of the internet as a major force in commerce, particularly in retailing, has created a number of anomalies in policy, as well as market distortions that discriminate against companies that trade solely or principally on the high street, but some of those anomalies are connected to wider risks to children and young people. Many of the rules established to protect children and young people from unfair or age-inappropriate commercial practices in the real world do not yet seem to have been fully translated into the virtual space, or to have found an online equivalent or proxy. There is a tendency for firms to say that what children do when they go online is entirely the responsibility of the parents or carers. While no one would dispute that parents and carers have a role to play, what we need to clarify is the extent of the obligations placed on companies and on every part of the internet value chain.

Can manufacturers of internet-enabled devices, perhaps especially portable devices, simply wash their hands of any and all liability for anything and everything that happens to children and young people when they use them? What about the companies engaged in providing access to the internet, whether via a fixed-line connection or via wi-fi? Then there are the online service providers, such as Google and Facebook, and online vendors such as Amazon and Tesco. What parameters are applicable to them? Where are the boundaries? This whole area has been largely neglected by scholars and the legal profession, and, I am ashamed to say, politicians.

No doubt companies have considered their position, but if they have, they have been slow to publicise their legal advisers’ views. Even if they did, it is likely that such views would take a very particular perspective.

Rushanara Ali (Bethnal Green and Bow) (Lab)

One of my constituents came to see me after being sexually harassed for years on Facebook. Her identity was stolen and her Facebook pages were photoshopped to damage her reputation. It took her a great deal of time to get any attention from the police or the organisation concerned—in this case, Facebook. Does my hon. Friend think that there should be greater clarity and transparency about what the process and principles should be, and what citizens and consumers can expect from suppliers such as Facebook, and from the police? Only when a death threat was made against my constituent did the police feel that they could take action. Until that point, they had to advise her to complain to Facebook.

Fiona Mactaggart

The case that my hon. Friend cites is an example of exactly why I called for this debate. In that case, Facebook was not taking proper responsibility. It did not have a transparent complaints process that my hon. Friend’s constituent was able to use. It did not have a mechanism for remedying the harm that she had experienced and, frankly, the police are not up to date enough with the online world. That is not true of the whole of the police service—for example, when it comes to child abuse images, the police have quite well-developed policing strategies—but in the case of online bullying, I think they are behind the game.

The fundamental responsibility, in that case, belongs to Facebook, but the police must take more seriously the fact that things happen in the virtual world that they would not tolerate in the real world, and they must ensure that their policies and procedures function appropriately in both. We have not grown up, as it were, and ensured that we have modernised our systems, including those of the police. My big argument is with companies such as Facebook. If they were to take their responsibilities more seriously, my hon. Friend’s constituent would have been much safer, and the problem would perhaps not have got as far as requiring police action.

Some new media companies seem persistently to fail to establish clear values and procedures for handling matters, such as the one that my hon. Friend raised, that can profoundly affect individuals and wider society. In the early days of the internet, that was perhaps understandable to a degree. They were learning; we were all learning. We are, however, no longer in the early days, and now such failure looks more like negligence or lack of concern. Too often, companies seem to struggle to recover a position, by which time a great deal of damage might have been done. I want to establish a new norm, whereby we expect companies, from very early on in their lives, to have an enforceable social responsibility code, which contains a publicly declared process for dealing with objectionable or illegal content.

Mr Tom Harris (Glasgow South) (Lab)

Does my hon. Friend not accept that putting “objectionable” in with “illegal” poses a danger to freedom of expression? The two terms mean completely different things. As a party that has generally supported freedom of speech, surely we should protect the right of someone to be offended if they so wish, or to say something offensive, as long as it is not illegal. We should be careful about merging the two definitions.

Fiona Mactaggart

My view is that because the internet so substantially broadens the audience for material, those who are responsible for doing that must take some responsibility for the content, in a way that they are not currently prepared to do. They obviously need to do that when the content is illegal, but I will go on to argue that they should also do it when it is objectionable. They should not necessarily delete everything in the first instance, but they must have a process by which someone who wants to object can properly make a case and argue for something to be taken down. The process should be transparent and contain a right of appeal, so that the matter can be dealt with.

Our publishers in the real world take responsibility for what they publish, choosing not to publish material that they deem profoundly offensive, and YouTube is effectively a publisher. It is dodging its responsibility as an institution that broadens the audience so significantly for the material that it carries. It is pretending not to be a publisher, and that is a bit of a fraud. I will go on to deal further with the issue that my hon. Friend the Member for Glasgow South (Mr Harris) raised.

A policy should guide companies when they decide whether to take down material, and there should be a right of appeal where appropriate. I would want companies to work with groups such as the Internet Watch Foundation and the UK Council for Child Internet Safety to ensure the promotion of public safety.

I initially intended to raise this issue because of the evidence that paedophiles have been using Twitter to groom young children; Members might have seen reports on that in The Sunday Mirror. I praise the newspaper for its campaign, because it has forced Twitter to take action to protect children. However, Twitter has still not joined the Internet Watch Foundation to show its support for the wider industry’s measures to keep child abuse images off the internet as a whole. That is a shameful example of a profound disregard for the interests of British children and young people. What is worse is that when the storm broke, Twitter simply retreated into a Californian bunker. It seems to me that it cynically decided to sit out the storm, in the hope that it would blow over and people would forget about it. Well, here is the bad news: it did not.

Habbo Hotel took a similar line when Channel 4 exposed how its site was being grossly misused and was putting children in danger. This case was, in a sense, much worse, because Habbo had at least signed up to various voluntary codes of practice. The only problem was that it was not honouring them, which speaks volumes about the weakness of our so-called self-regulatory regime for the internet in the UK. Even BlackBerry, a company in my constituency that is ethical in many important ways, was found wanting when it emerged that child pornography was not being blocked for users of its handsets on any network except T-Mobile, and the same was true for adult content. Given how popular BlackBerry handsets are with kids, that was truly appalling, but I am happy to say that both matters have now been put right.

Failure to act can lead to tragedy. It is only two weeks since Tallulah Wilson killed herself after visiting suicide websites. At the time, a spokesman for the Samaritans put the need for more responsible behaviour well:

“It is important that organisations which run sites that are highly popular with young people develop responsible practices around suicide-related content, including promoting sources of support and by removing content which actively encourages or glorifies self-harm or suicide”.

Glorifying self-harm or suicide is not illegal, but it is profoundly dangerous. The new Health Minister, the hon. Member for North Norfolk (Norman Lamb), last month warned that telecommunications companies faced being regulated by the Government if they failed to block websites offering advice on suicide. It is time for the companies to act.

Then there was the unrest caused by the publication on YouTube of the provocative American-made video insulting Mohammed. It caused deaths and injuries around the world when so many people saw or heard of it.

Mr Tom Harris

I feared that the debate was heading in that direction. Can we just be absolutely clear that the deaths and injuries throughout the world were not caused by the YouTube video, obnoxious and appalling though it was? They were caused by fanatics who chose to resort to violence against innocent people. No one forced them to do that.

Fiona Mactaggart

My hon. Friend is right, but what happened was completely predictable. Responsible publishers choose not to publish things that are designed to provoke. I have not seen the video, but I persuaded someone in my office to watch it, and the clear intention of the material is absolutely to provoke. It was irresponsible for YouTube to carry the video.

In its response, Google, rather like my hon. Friend, uttered pious words about free speech and the first amendment, but I would like to make some observations about that. Google is an exceptionally profitable business. It is not a charity, or an agency that can lay claim to moral or political leadership in any credible way. I say that not just because of the mounting number of times that Google, in relation to other parts of the internet, has been hauled before the courts and regulators and has lost. The company seems to be highly selective about the parts of the law that it wishes to observe.

Many Muslims in the UK and throughout the world—some of whom reacted in the way my hon. Friend described, and some of whom simply demonstrated peacefully outside Google’s UK headquarters—were deeply offended by the video and by YouTube’s failure to remove it, except in the two countries where the company acknowledged that there might be violent protests. I understand that YouTube has now also disabled links to the clip in at least two other countries, including India. It became clear, therefore, as the tragedy of the video unfolded, that the company did not have an absolute fixed position that it would defend to the nth degree. It was a movable feast, but it moved too slowly, and only after too many people had died, been injured or had their property destroyed. That highlights the inadequacy, or at any rate the inconsistency, of YouTube’s processes. I have looked at those processes so that I can try to advise people who have been hurt by the video, and the processes are almost deliberately opaque and make it hard for people to find any mechanism to address their hurt.

I shall not address the issues that the hon. Member for Devizes (Claire Perry) has led on in Parliament, because she wants to speak later, and I want other Members to have a chance to contribute to this debate, but I am concerned that decisions—the Muslim video is one example—appear to be taken on an ad hoc basis. A codified, publicly available system would help to show that Google—this applies to other companies, too—is serious about its responsibilities. The companies need to grow up. They are not young cowboys battling on the wilder edges of a new territory about which we know little; we now know a lot, and it is time that that was reflected in the behaviour of internet businesses.

Mr Gregory Campbell (East Londonderry) (DUP)

The hon. Lady is outlining the thrust of her powerful argument against the likes of Google, Facebook and Twitter, but she has not said what sanctions, if she were successful and her campaign moved to a logical conclusion, a Parliament in an individual nation state might apply that could protect the people whom she and I seek to defend.

Fiona Mactaggart

The hon. Gentleman is right that I have not stated the sanctions that Parliament could apply, because in this debate I am arguing, in the first place, for the industry to grow up, take responsibility and properly self-regulate, and not to say, “Oh, whoops, we are being embarrassed, so we are going to do something,” or, “Oh, whoops, it is dangerous in that country, so we will sort it there.” I am saying, “Come on; you are in the last chance saloon, and you need to take responsibility. If you do it well and right, the Minister will not need to intervene, but if you do not, I will be the first person, not just in this Chamber but in the House, arguing for much more powerful regulation.” That is not where I want to go first. I expect companies not to be surprised when they get it wrong, and to ensure that they put in place proper mechanisms to protect not just vulnerable internet users, but all of us.

My final point is about child abuse images. The Internet Watch Foundation is a model and example to the rest of the world, but it addresses only a narrow, albeit important, part of the internet—the web and newsgroups. Figures recently released by five police forces in England and Wales—Cambridgeshire, Dyfed-Powys, Humberside, Lincolnshire and Nottinghamshire—show that between 2010 and mid-2012, they seized 26 million pornographic images of children, which is an incredibly troubling number, but think about this: someone calculated that that might mean that more than 300 million images were seized across the country in the same period. Not only does that beggar belief, but it tells us that something is definitely not working as it should. Somehow or other, the industry and all of us need to up our game and confront such harm.

14:53
Claire Perry (Devizes) (Con)

I compliment the hon. Member for Slough (Fiona Mactaggart) on securing this extremely valuable debate. I know she has campaigned tirelessly on the issue and will continue to do so.

I would like to narrow the focus of the debate specifically to the internet service providers. In the UK the top six companies control and sell about 95% of access into the home, which is the place where most of the children to whom the hon. Lady refers are accessing such troubling images. Those companies generate some £3.5 billion a year through access fees, and they are, by and large, well known household names, typically with a well developed sense of corporate social responsibility.

Historically, we have had an ideological situation in which the internet has been treated differently from any other form of media. As the hon. Lady says, back when the internet was a few pony-tailed developers and a specialist thing that dialled up slowly in the corner of our sitting rooms, that was just fine. Indeed, the light-touch regulation, or lack of regulation, and the global nature of the internet are what have made it such an extremely valuable and innovative forum. Of course, that has changed. The internet is now, arguably, one of the most mass-market forms of communication. With technological convergence, particularly with the rise of internet-enabled televisions, 3G and 4G networks and view-on-demand systems, the internet is rapidly overtaking all other forms of media as the place where many people, particularly the young, socialise and access information and news.

As the hon. Lady alludes to, although children use the internet for all those incredibly productive and wonderful things, with their extraordinary curiosity they also seek out and stumble across material that is very troubling to many. We asked a group of adults whether they are concerned about the ease of access, particularly to adult material, on the internet, and 82% said that they are extremely concerned about how easy it is to access not just pornography but websites on self-harm, suicide and bullying, which are the things that we would all like to protect our children against but struggle to.

Why do we struggle to do so? I, of course, am a great believer in personal and family responsibility. It is my job as a mother to keep my children safe in the online and offline worlds, but I submit that the technology we have been using to do that is almost obsolete. We have been asked since the earliest days of the internet to download device-protection filters ourselves. People who live in a household like mine will have multiple internet-enabled devices, which we are supposed to protect individually. The download process can be slow, and I submit that in many households the teenage child is the web tsar and computer guru, not the parent. If the parent says, “Have we downloaded the safe search and protection filters?” big Johnny or Janie will say, “Of course, mum and dad, don’t you worry. Off you go. Don’t trouble your little heads about it.” As a result, the proportion of parents who say they have downloaded internet controls or filtering software in households with a child aged between five and 15—remember that 95% of children live in internet-enabled households—has fallen 10 percentage points over the past three years to 39%. That means that six out of 10 children potentially live in households where there is no filtering of content. Troublingly, that proportion drops even further to 33% for teenage children, so two thirds of children aged 13 to 15 live in unprotected households. We can debate for ever the rights and wrongs of that, and how it is all the responsibility of parents, but we know that 82% of parents care about this, so it is not a non-issue. The technology and the compact of responsibility have broken down.

What to do? This debate has been started many times. Indeed, the previous Government worked very hard and commissioned a number of reports, including the Byron review. They took the issue very seriously. We have moved on, but little has been done.

We tend to debate ideology. Free speech comes up frequently, and, of course, when defining pornography, one woman’s pornography is another man’s enjoyable Sunday afternoon.

Mr Tom Harris

Don’t look at me!

Claire Perry

Sorry, I was not looking at the hon. Gentleman with an accusatory glance.

My point is that the debate has often been sterile, ending up with discussions of censorship. I would never like to see that, because I do not believe in censoring material; I believe in responsibility and companies signing up to an agenda.

The hon. Member for Slough and I, as many Members did on a cross-party basis, suggested a parliamentary inquiry. We took a lot of evidence and came up with the idea that an opt-in system is the best way to deliver protection. Each home would have a clean feed, using the same filtering technology as is used in device-level filters and in schools—the technology is simple and cheap—and people opt in to receive adult content. There would be choice, there would be no censorship and the material would still be available. That proposal was very popular, and almost two thirds of adults say they like the idea of opt-in technology.

I am proud to be part of a Government who have continued to take the issue seriously. The Prime Minister commissioned the Bailey review, which examined child sexualisation and child safety and resulted in the first little step forward in the internet safety debate: active choice, in which people are forced to say whether they want filters installed. To return to the big Johnny or Janie problem, how many households truly involve the adults in making that decision?

Mr Harris

An aspect of that has been raised with me. One potential problem with the opt-in system—the hon. Lady will probably be able to answer this—is that there are numerous teenagers who cannot rely on being able to speak to their parents about sensitive sexual health issues. With an opt-in filter when signing up to a new internet service provider, I am told that there would be a danger of blocking sites that give reproductive health advice. Many children cannot ask their parents about such issues—I expect about 99% cannot, now that I think of it. That could be a dangerous consequence. Has she considered that particular aspect?

Claire Perry

I thank the hon. Gentleman for that thoughtful intervention. Those are some of the questions that get raised: blocking sites that help children with their homework, or that concern sexual health, sexuality and other things that we know children are more comfortable talking about to friends and others on the internet than to their family.

We asked the Family Planning Association, a laudable organisation that publishes a lot of material about sexual health and guidance, and it was supportive. The FPA says that the problem right now is that children are accessing porn as a way of receiving sex education. That is not good sex education. It teaches children nothing about relationships. The FPA felt that using an age verification system—

Rushanara Ali

I support the hon. Lady’s proposal. It will protect young people not only from being groomed but from being radicalised on the internet; we have seen examples. It is a particular concern for Muslim parents, but also for others—those whose children are converts, for instance. The individual responsible for the attack on my right hon. Friend the Member for East Ham (Stephen Timms) was radicalised on the internet. We need action not just to protect children against harassment but on those kinds of issue. Anything that can address the problem would be welcome from both perspectives.

Claire Perry

I thank the hon. Lady for pointing out that it is not just what we might think of as pure pornography that is a problem, but many other things too. I say to both hon. Members that in the debate on this issue, we have always been in danger of letting the perfect be the enemy of the good. Filtering systems are well established. A lot of human intelligence goes into the filtering systems used by companies such as TalkTalk, which has gone furthest. It is completely possible to amend the system while ensuring that appropriate levels of material are available, just as they might be in a school environment. However, it is a worthy point.

I will continue, as I know that others are keen to speak. I was extremely proud that, with the help of Members from across the House, we were able to persuade the Government to hold a formal inquiry into the opt-in proposal, led by UKCCIS. I will raise the question of Government complexity in a moment, but the inquiry had more than 3,500 responses, and I was proud to help deliver a petition with more than 115,000 signatures to No. 10 calling for an opt-in system and calling on the Government to take the issue seriously.

I think the Government do take the issue seriously, but there are many complications that must be addressed. First, as the hon. Member for Slough said, we do not have a regulator; we have a mish-mash of organisations involved in regulating the internet. In such a system, it is easy for companies to behave in an irresponsible manner or, as she mentioned in referring to a large search company, to basically make it up as they go along, with every test case being a different case. There is no clear regulation setting out a course of direction or what responsible behaviour looks like. That was one of our recommendations: give the issue to one regulator.

Secondly, there is the ideological question. It behoves us all not to have the debate about free speech versus censorship here. Of course, we must have that debate, but it is a false debate here. We are talking about children in unprotected households accessing damaging, dangerous and violent material, and we know that people are concerned about it. It is important to have a pragmatic solution rather than an ideological response.

I say not to the Minister, to whom I know it does not apply, but to others that we run in fear of the internet companies in many cases. I have asked repeatedly for evidence suggesting that an opt-in solution would be disproportionately costly or technologically impossible, or would somehow damage Britain’s internet economy, which is extremely valuable—it contributes about 8% of GDP—and is growing rapidly. Evidence there is none. It is a pence-per-1,000-users solution. It already exists, the technology is there and it is well developed. We can deal with the question of false positives and false negatives. If I ask start-up companies located at the Shoreditch roundabout, “Do you care if we have opt-in filtering on home broadband or internet provision?”—that is the most developed part of the market; only six companies offer 95% of services—they look at me as though I am mad. It has nothing to do with their business model.

I urge the Government to review the evidence. We have not yet had the evidence review session that we were promised on the inquiry. I understand that faces have changed. I would like to get it right rather than do it quickly, but also to focus as best we can, given the number of Departments involved, on the right solution to protect our children.

15:05
Yasmin Qureshi (Bolton South East) (Lab)

It is a pleasure to speak in this debate under your chairmanship, Mr Owen. I congratulate my hon. Friend the Member for Slough (Fiona Mactaggart) on securing it.

I start from where the hon. Member for Devizes (Claire Perry) stopped. Asking for self-imposed regulation of the industry does not mean that the economy of our country, the booming internet trade or what happens on the internet will suddenly come to a stop, or that we as a country will somehow become less economically effective. This debate is about the fact that, as has been said, the internet reaches out to billions and billions of people around the world. Unlike what is in newspapers or on television, which may be limited to particular countries—although somebody travelling to a country might be able to see it—something posted on the internet can be seen by everyone in the world who has access to a computer.

What the internet says is therefore powerful. It is amazing that such a powerful institution or body has no regulation and no sense of responsibility for what is put on it or taken off. As has been said, a lot of internet companies act differently in different countries, so they seem to be sensitive in relation to different countries, although that sensitivity is probably based on economic rationales rather than anything else. Although economics is important, so is the internet’s effect on people.

This debate always ends up with arguments about freedom of expression and the idea that saying that there should be an element of regulation of what appears on the internet, or even in the print media or on TV, somehow curtails people’s freedom of expression. Freedom of expression has never been completely unfettered. As has been said, there have always been things that are illegal to say. Some people might say that if we want to take freedom of expression to its extreme, people should even be allowed to say things that are illegal, and that there should be no restrictions at all. However, we do have restrictions, and rightly so. There is nothing wrong with talking about objectionable material.

I will not discuss sexualisation or the effect of pornography, as the hon. Member for Devizes spoke about it in detail and it is pointless to repeat the same thing. However, I entirely agree with her about the dangers to young people, adults and others who are vulnerable, and I agree with everything that my hon. Friend the Member for Slough said.

May I say for the record that I agree with self-regulation rather than a statutory framework? An awful lot is said on the internet that can harm people’s reputation, for instance. I do not see why everybody always says that people’s sensitivities should be ignored completely and that everything objectionable should be on the internet. I am sorry, but while there is freedom of expression—I know that there is no such thing as the freedom not to be offended—we must draw sensible parameters.

If I, or anyone, were to say on the internet that everyone with pink eyes should be put to death at birth, some might say, “Well, what is wrong with that? That is not too objectionable. Pink is not my favourite colour, so why not?” That is a bizarre example, but people might want to say it—in the past, people have used such expressions about specific groups of people in the world. That would be objectionable and it might be illegal, but I do not think people should be putting things like that on the internet. If they do, there should be a mechanism for regulation. Even if material is not as extreme as saying that people with pink eyes should be put to death at birth, it is still objectionable. I do not see why there should not be a system in place to enable people to raise the issue with the companies concerned and explain why it is a problem.

We touched on the issue of the American film on YouTube. My hon. Friend the Member for Glasgow South (Mr Harris) said that this debate would end up going in that direction, but I want to address the point because a lot of people wrote to me to complain about the content of that film and said that it was objectionable. If people want to discuss a concept in any religion or culture, they should be able to write about it. Nobody is saying that there should not be a discussion or dissemination of ideas. However, when the whole intent is to provoke people, abuse people and vilify people, that cannot be right. Surely somewhere along the line common sense must come into play.

Rushanara Ali

Does my hon. Friend agree that it would be helpful, particularly for those who do not have power or money and are not clear about their rights, to be able to receive free, high-quality and accessible advice on some of these questions? I am not aware that such a provision exists, but perhaps the Minister could consider it as a first step, particularly to help vulnerable people—parents who worry about what their rights are and how they can be enforced—or to help put pressure, as I found in the case of my constituent, on the police to take action so that these issues are not passed around before they become more serious. Related to that point is libel—where people’s reputations are damaged, something that I experienced myself during my election campaign. It takes a long time and many threats of legal action before libellous material posted on the walls of host sites, or on sites that are themselves libellous and wrong, is taken down. Surely the Minister could help with that.

Yasmin Qureshi

I agree with my hon. Friend. Such an example would be the famous case of Max Mosley. Even though what was written in newspapers was found to be defamatory, it continues to be published on the internet.

I was a member of the Joint Committee on Privacy and Injunctions. The managing directors of Google, Facebook and Twitter gave evidence, and the Committee explored the issue of why content that a nation state has clearly declared illegal is not removed. There were not many issues on which the members of the Committee were unanimous, but we all agreed that all three companies were just twisting and turning and not giving us direct answers. They had to be pressed hard. Initially, they said that it was technically not possible, or difficult, or expensive, or impossible to monitor. When the Committee asked more detailed questions, such as, “Do you have the technology? Is there no software available?” basically, it boiled down to the fact that they did not want to do it—it was as simple as that. It was not in their financial interests to do it. It was not in their profit-making interests to do it. It was not that they could not do it because it was so difficult; they just did not want to. We got that answer—not even then was there complete acceptance—after God knows how many questions. Eventually, there was an admission that, technically, there was no reason why they could not do it. We at least got to the bottom of that.

The Committee looked at the whole issue of regulating the internet. Everybody accepts that there are challenges—they may be technical challenges, but they certainly can be overcome if the desire and intention is there. The issue is all about saying, “We know you can do these things. Why don’t you self-regulate?” If there is content on the internet, whether via YouTube, Facebook or Twitter, that is offensive, rude or defamatory, people should not have to go through the long process of dealing with the law. Max Mosley is a rich man and is able to do so. I think he has challenged Google many times. Every time he makes a challenge, content is deleted before it eventually reappears. Most ordinary people cannot do that—they do not have the money, time or resources. There should be an internal mechanism to deal with such cases. When there is freedom of expression and people can say what they like, it is important for there to be responsibility.

I will return to the recent YouTube case. I accept that YouTube did not cause the deaths, but it is fair to say that it knew what would happen. It was done deliberately to provoke, annoy, vilify and abuse. It was not done to discuss and disseminate issues and ideas. It was not done as an academic discussion about a particular aspect of a particular religion, or any particular character in any religious history. It was done purely as a form of abuse. At that point, we have to think about the level of abuse that is aimed at people, whether they are dead or alive.

Mr Tom Harris

My hon. Friend provokes me into one more intervention. She said earlier that where something on the internet is offensive, rude or defamatory there should be processes to resolve that. Offensive and rude are not remotely, and never will be, illegal. Defamatory is illegal. I ask her once again to draw that distinction. Something being offensive does not necessarily mean that anyone has to withdraw it. There were many people in our party, before the age of the internet, who were actually apologists for those who wanted to ban Salman Rushdie’s “The Satanic Verses.” That was unacceptable then and it would be unacceptable now. We have to be very careful that we do not throw the baby out with the bathwater.

Yasmin Qureshi

I am not an apologist for the Salman Rushdie issue. That was a book that was trying to discuss ideas. As my hon. Friend says, the internal rules of this country can decide whether something is illegal or defamatory. It is one thing to have a discussion about particular issues or concepts, but it is another to take that to an extreme. For example, there is an old film called “The Life of Brian”, and other films have been made about Jesus Christ. Within the Churches, there may be a number of issues—for example, homosexuality—that people would like to discuss. I do not think that anybody says that those ideas should not be discussed.

However, I have sympathy for the billions of Christians across the world. We can debate issues, but that is not the same as showing someone they revere so much in an intimate situation, when one of the aspects of the religion, or of the person’s life, was the fact that he was a gentleman who refrained from intimate relationships. Talking about it is one thing, but to depict it and show it: is that freedom of expression or a deliberate attempt to generate publicity and create loads of money? Obviously, the minute a film becomes controversial it often becomes a bestseller; but at the same time billions of people have been badly offended. Perhaps we should think about the concept of complete freedom of expression—although it has never been complete. We should think about people’s sensitivities. That does not mean talking about censorship, or saying that people cannot discuss ideas, or that there cannot be freedom of expression or discussion; but we should think about it.

More importantly, as most hon. Members who have spoken in the debate have said, there is no system to deal with the issues. If there is something on the internet that is defamatory, wrong, objectionable or offensive, people should be able to contact the companies concerned and express their views. Then the companies would at least have the chance to consider things and say, “Maybe we should take this away, and we should not have this photo online.” There is no such mechanism at the moment. It is difficult. As for YouTube, it was asked to remove material in the US, and it did. Internet companies are selective about what they choose to take off and put on, and mostly the motive, I am sorry to say, is profit. That is the ultimate goal for all of them. They are not talking about freedom of expression. Perhaps mine is a personal and old-fashioned view, but I do not think insulting and abusing people is freedom of expression. It is just downright abuse and bad manners. However, I digress.

I want to end by saying that we should have a system that is simple to follow for people who are unhappy with what is on the internet, and that the response of the internet companies should be swift as well. When something happens it should not go on for months, with the item being taken off perhaps a year down the road. By then the damage has been done. It is important to have a system that is swift, simple and cheap.

15:22
Damian Hinds (East Hampshire) (Con)

I congratulate the hon. Member for Slough (Fiona Mactaggart) on securing this important debate and this opportunity to discuss the issues. I also pay tribute to my hon. Friend the Member for Devizes (Claire Perry) for the campaign she has brought to Parliament.

I want to comment—briefly, you will be relieved to hear, Mr Owen—on one aspect of the subject: search returns. The debate opened with the hon. Member for Slough raising the issue of definitions, and making the point that the term “internet company” is no longer appropriate. The term “search engine” is no longer really appropriate either. The companies in question are advertising companies. There is nothing wrong with advertising companies and agencies; we have had them for years. The challenge for public policy in this place is that that is not how people think of them. They tend to think of the giants of the web—essentially Google, Facebook and Twitter—more as utilities than advertisers or advertising media companies.

People who work in the industry like to say, “You just don’t get it. The thing is, on the internet, people are, like, looking for stuff, and we, like, help them, like, find it.” Of course, that is true, but it is tempered by commercial considerations. It is also true that in some cases they “help you, like, find stuff” that you did not actually “like, know you were, like, looking for,” through contextual and behavioural targeting. Again, there is not necessarily anything wrong with that as an advertising media technique, but it creates another challenge, which is that most people, including most public policy makers, do not understand how it works.

It might be worth reiterating briefly how search engines make money. Essentially they do it through paid placements, according to the formula PPC x CTR, which is the pay-per-click bid times the click-through rate. Of course, that applies only to a relatively limited number of search returns—usually a couple at the top of the page and some down the side. However, the number varies over time. A comparison between Google.com in the United States and Google.co.uk in this country shows that variation. Commercially, search engines have the potential to make the market work better, and therefore contribute to economic growth; but they can also add cost. That is relevant to the debate. They add it in two ways: first, through the competitive bidding, because that PPC x CTR formula contains natural in-built inflation. Secondly, in certain sectors, for a mathematical reason with which I will not detain or bore Westminster Hall today, second-tier intermediaries can be created. That is to do with—well, I had better stop there, but believe me, it happened. It happened, for example, in the travel industry in a big way.
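
To take a purely illustrative example of how that formula works: if one advertiser bids £2 per click and achieves a click-through rate of 3%, while another bids £3 but achieves a rate of only 1.5%, the first is worth 6p per impression to the search engine and the second only 4.5p, so the first wins the better placement despite the lower bid. The figures are invented, but they show how the formula rewards relevance as well as money.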

The point for corporate social responsibility is that those same pressures also apply in areas that go far beyond the purely commercial sphere. In a good way, search engines and other players on the internet can help people in their quest to get help, but the counter-pressure also applies, which is that where money and a commercial motivation are involved, the effect can be the opposite. It can become harder for people to find the help they need.

The area that I am concerned about is debt. When it comes to chronic personal debt, the normal rules of supply and demand tend not to apply. People regularly take out loans that are not the cheapest to which they could have access, and which they cannot afford to pay back. Similarly, for people seeking help—which could be through debt consolidation, a debt management plan or just straightforward advice—the routes they end up on are often, unfortunately, not the ones that are best for them, but the first that they encounter at the point when they think they need to do something different. These days, of course, a key place to go—the first place to go, for many people—would be an online search.

The internet has improved somewhat in this regard in recent years. When people enter search terms to look for help with debt, it seems more likely now than it was even a year or two ago that the top half of the screen will show appropriate, sensible, responsible providers who can help. I do not know what is driving that. I hope that it is a commitment on the part of search engines to improve, and to make sure that people can get access to that information. The issue has been brought up in the past in this place, and I hope that some of the message has got through. However, we must be conscious that however good or bad things may be today—and they are not perfect; the first two results that come up will still be for debt management companies—there is no guarantee of their staying that way. The arena is constantly changing. Technology is constantly changing. The algorithms that determine which ads are served to different people are constantly becoming more sophisticated.

I would like a clear, public and ongoing commitment from the providers of search on the internet that, in relation to debt, they will both elevate and clearly mark out providers such as Citizens Advice and the Consumer Credit Counselling Service, which offer a responsible service. That approach could be extended easily to other areas where people find they are in difficulty. I do not think that we need legislation to do that, but the Government can have a role in exhorting providers to do it.

15:29
Kevin Brennan (Cardiff West) (Lab)

I, too, congratulate my hon. Friend the Member for Slough (Fiona Mactaggart) on securing the debate, and other hon. Members who have spoken. I have a lot of sympathy with some of the points made by the hon. Member for Devizes (Claire Perry) and hope that she is successful in persuading the Government to take action. I also agree with many of the points about advertising made by the hon. Member for East Hampshire (Damian Hinds), and with those made by my hon. Friends. My hon. Friend the Member for Glasgow South (Mr Harris), who made several interventions, has a point about being clear on free speech, and about being clear that, whatever our view of something posted on the internet, we should always condemn violence, which is never justified and certainly was not justified in the cases that we have heard about.

When I was a Minister in the Department for Children, Schools and Families in the previous Government, we took forward the Tanya Byron review on internet safety for children, which was mentioned by the hon. Member for Devizes. That was an interesting experience. I commend that report to hon. Members, because it is still relevant, even though it is a few years old. At the time, my daughter, who has just started university, was a teenager, and I thought that, as the Minister responsible, I had better look a bit closer at what she was doing online. She had been making videos and putting them on YouTube. I asked her, “Why do you do that?” She said, “I’ve got to think of my followers.” I asked what she meant and she said, “I need to be sure that my fans are getting some good videos.” I had a look, and one of the videos that she made had more than 100,000 views on YouTube. One comment underneath a video—these were Harry Potter fan videos—said, “How old are you?” She replied, “It’s not my policy to reveal my age.” That made me think, during the Tanya Byron review, that having built a swimming pool, the most important thing is not to put up a sign saying, “Danger! Deep end”, but to teach people to swim, and to have the resources to understand the medium they are dealing with, including who is at the other end of an online comment. By and large, although they can be vulnerable, children are quite savvy and intelligent. That proper level of education about the dangers on the internet is the first and strongest protection we can give, before starting to talk about what the Government can do in relation to regulation.

As several hon. Members have said in relation to responsibility, this is relatively new. The internet has emerged as the biggest, most important technological change of the past 20 years, and has changed our lives in a transformational way. It started as a wild west area, but the observations made by the hon. Member for East Hampshire are important and pertinent here, because this is essentially, overwhelmingly, a tool for carrying advertising. In relation to some of the irresponsible things that we see online, including on social media sites such as Facebook and Twitter, what drives those platforms’ existence, ultimately, is advertising. People advertising on websites are, by and large, companies—often large companies—with corporate social responsibility statements, and they would not tolerate their brand being associated with some of the things on the internet that we have heard about today, including the activity of trolls, child pornography, and so on.

Turning to public policy, we should hold the advertisers to account, as well as the people who provide the platform, to ensure that we are naming and shaming, and showing companies that purport to be socially responsible corporations where their advertising is appearing, and what it is appearing next to, from time to time. Ultimately, that commercial pressure will force, and is forcing, greater responsibility on to some of the newer companies, such as Facebook, which have only existed for a small number of years. That is important.

In Westminster Hall not so long ago, we debated the way that search engines, because of the algorithms that the hon. Member for East Hampshire mentioned, often throw up results at the top of the page that, say, encourage people to download a music track illegally before they are even offered the opportunity to purchase it legally online.

Damian Hinds

The hon. Gentleman makes some important points about the responsibility of advertisers. Will he acknowledge that a development on the internet that a lot of people do not understand is that an advertiser may not know where their advert will appear, because they give agency, effectively, to the search company to put it in context according to its algorithms, providing them with the greatest number of hits?

Kevin Brennan

Yes. My answer is that that is not good enough. A company that purports to be corporately socially responsible should insist on knowing where its advertising will end up, and should not just be presented with the result of an impersonal algorithm devised by an advertising company. That is my point. Companies need to be held to account for caring about where their advertising ends up, because if they take no interest in that, it will ultimately do reputational damage to their brand.

I want to say a few words about internet trolls and so on. A terrible incident, which hon. Members will have heard about, happened in my constituency a week last Friday. A person drove a van deliberately at people—mainly women and children—killing one of my constituents, Karina Menzies, leaving her three children motherless, and maiming, injuring and traumatising countless others along the way. That was an awful incident. I thank all hon. Members who have expressed their sympathy for my constituents.

Of course, as we know, inevitably there are people out there online who seek to upset, provoke and offend in these cases. Some things that people say in these instances will not be illegal, as my hon. Friend the Member for Glasgow South said, but some may be actionable and illegal. Nevertheless, they are offensive and have the capacity to cause public disorder and, in some instances, as we have seen in other tragedies, to lead people to take their own lives, so upsetting is the abuse that they have suffered online. There is, in particular, a strong case to be made for social media organisations to take these matters seriously.

I want to give some small words of praise to Facebook, because after I mentioned some pages of that kind that had appeared in the wake of that incident, it took them down quickly. That is new. Its policies are in the process of being developed. As such companies reach maturity, they will understand that it is unacceptable to hide behind the defence that they simply provide a platform and what appears on it is nothing to do with them. If we were happy for people to paint defamatory or deeply offensive comments about our neighbours, or someone else, on a wall outside our house, we would have to say that we had some responsibility for that wall and what appeared on it, and a responsibility to do something about it, particularly if we were making money out of that process. There is some change, but I sense that it may have been easier for me as a Member of Parliament to contact Facebook and get that action taken than it might have been for some of my constituents.

On every Facebook profile, there is a “Like” button that people can click. Why is there not a button as prominent and clear saying not so much “Dislike”, but “Report abuse”, or whatever? That is the minimum that should be required. When I was a Minister, a social media company called Bebo was quite prominent with young people, although it is less so now—hon. Members probably remember it. It refused time and again to put a prominent button on pages for children to enable them to report abuse, creepy questions or whatever they were encountering on Bebo. That is the minimum that we should expect from these companies.

15:39
Helen Goodman (Bishop Auckland) (Lab)

It is a pleasure to serve under your chairmanship, Mr Owen.

I congratulate my hon. Friend the Member for Slough (Fiona Mactaggart), who made an excellent speech. This is an important, timely debate. I also congratulate the hon. Member for Devizes (Claire Perry) not just on her speech, but on the excellent work she did in setting up the all-party group and undertaking the inquiry, which raised the profile of the importance of taking clear steps forward to protect children on the net.

Social media companies claim that they have policies to protect users, prevent crime and avoid bullying, but from what we have heard this afternoon, such policies are clearly failing. I, too, have examples from my constituency. A schoolgirl who recently came to see me had been bullied on Tumblr. When she complained and asked the company to deal with it, she was told that it was up to her to identify the perpetrator. Last week, the Internet Watch Foundation published research that shows that 88% of self-generated, sexually explicit online images and videos of young people are taken from their original location and uploaded on to other websites. While some young people may be developing the skills to protect themselves online, a lot of others clearly are not. Another family in my constituency came to see me. The father had been murdered and they were being bullied and abused on Facebook by the family of the offender, who is in prison. When they complained to the police, the police took no action.

My hon. Friend the Member for Lewisham East (Heidi Alexander), who is not present, tried to introduce a ten-minute rule Bill after some gangs were involved in a murder in her constituency. The perpetrating gang posted an abusive rap on YouTube and it took months to get Google to remove it. When we met its executives, they said that they had people in the UK monitoring things all the time, but they could not even tell us how many people did that work. They also seemed to be confused about whether they were operating within a British or an American legal framework. As a final example, a young constituent of my hon. Friend the Member for Darlington (Jenny Chapman) was groomed on Facebook and, unfortunately, murdered by the person who had groomed her. All such episodes, including the ones described by other hon. Members, demonstrate that the current situation must change. Ministers need to be far more energetic in tackling the problems.

What do I think we need to do? First, on free speech, of which there has been some discussion and which is a fundamental human right, it is important to remember that in this country, unlike the United States, free speech is a right with conditions and is to be exercised responsibly. Having the right to free speech is not like holding the ace of spades and being able to trump every other right, such as the right to a fair trial.

Secondly, it is worth thinking about what drives so much of the abusive behaviour on the net. We had a little kerfuffle about that last week. I believe that the cloak of anonymity allows or enables some people to behave in ways that they would not in ordinary life. I do not mean that we should all post our bank account numbers online for everyone to see, or that nicknames should be banned on Twitter, but the idea of moral responsibility requires that a person be identifiable in order to take responsibility. To assert rights, there must be a rights holder. It is therefore a worry that ICANN, the Internet Corporation for Assigned Names and Numbers, the private body in the United States that administers the internet's naming system, does not know the provenance of a third of the websites registered with it. Nominet's current consultation on how to verify registrants is helpful and something that we might be able to build on.

The Government as a whole should take the issue seriously. At the moment, we seem to be dealing first with one Minister and then with another—there does not seem to be a proper strategy. For example, in the context of the Defamation Bill, we have raised anonymity with the Minister’s colleagues in the Ministry of Justice; I hope that in the light of what he has heard this afternoon, the Minister will go to those colleagues and seek to strengthen clause 5 of the Bill. As currently drafted, it is not mandatory to include and publicise an e-mail address for complaints on open websites, and a complainant may need a court order even to pursue a case against someone who wishes to remain anonymous, which is a slow and costly process.

Thirdly, the idea of an enforceable code, suggested by my hon. Friend the Member for Slough, is extremely interesting. Abuse on the net, whether of children or adults, whether criminal or simply unpleasant, is a growing problem and the Government are failing in their duty to get to grips with it and to protect our citizens. In an Adjournment debate on 17 September on internet trolling, a Home Office Minister responded and listed some of the legislation that can be used to deal with abuse on the net. At this point, I say to my hon. Friend the Member for Glasgow South (Mr Harris) that there is a difference between being offended by someone’s views and being subject to harassment on the net, and that distinction is made in the law.

I asked the Library for a list of the pieces of legislation that can be used to tackle the problem and was told that there were seven: the Malicious Communications Act 1988, the Communications Act 2003, the Protection of Children Act 1999, the Telecommunications Act 1984, the Public Order Act 1986, the Computer Misuse Act 1990 and the Protection from Harassment Act 1997. When I looked at the relevant provisions, many seemed to overlap, so I am not clear whether they are an adequate basis for the sort of code that my hon. Friend the Member for Slough is suggesting, and they certainly present a confusing picture. I want Ministers to initiate a cross-departmental review. Currently, we have shambolic confusion and no coherent strategy from the Government.

In the absence of action by the Government, the Crown Prosecution Service is consulting on the use of the existing criminal law. The Director of Public Prosecutions said:

“Social media is a new and emerging phenomenon raising difficult issues of principle, which have to be confronted not only by prosecutors but also by others including the police, the courts and service providers. The fact that offensive remarks may not warrant a full criminal prosecution does not necessarily mean that no action should be taken. In my view, the time has come for an informed debate about the boundaries of free speech in an age of social media.”

That is an extremely helpful contribution.

The DPP’s remarks highlight another issue. New problems require new solutions, new practices and new skills, not only for the courts but for the police, social workers, teachers and medical staff. Such professions will need to adapt and modify their work and learn new techniques to ensure, for example, that e-crime is taken seriously, that court orders to offenders cover cyber-bullying or that teachers can give good advice to young people. All that places a new burden on the public purse, with special training and awareness-raising needed.

Many colleagues this afternoon have mentioned that money is an important driver, which brings us to the next area in which the social media companies need to improve their social responsibility: the paying of taxes. It is simply not acceptable that, through artificial devices such as extortionate payments for licences, these companies continue to depress their declared profits. As a result, Facebook, with an estimated income from advertising of £175 million in this country, paid no tax in 2011. Google, which in the US estimates its UK income to be more than £2 billion, paid only £3 million in taxes. According to the House of Commons Library, Twitter UK has not even submitted any accounts. Such firms are putting a new and costly burden on the public purse, but they are not acting as responsible corporate citizens. The Government cannot stand back and ignore that. Ministers need to ensure that Her Majesty’s Revenue and Customs uses all the weapons at its disposal and, if necessary, they need to legislate further in order to crack down on avoidance devices. I suggest to the Minister that that is as important as dealing with the regulations and the code described by my hon. Friend the Member for Slough.

15:49
The Parliamentary Under-Secretary of State for Culture, Olympics, Media and Sport (Mr Edward Vaizey)

It is a pleasure to serve under your chairmanship, Mr Owen. I congratulate the hon. Member for Slough (Fiona Mactaggart) on securing this important debate. We have had some useful contributions from hon. Members, including the hon. Member for Devizes (Claire Perry), who is well known for her campaigning to protect children from online pornography. I had the welcome experience, for the first time, of hearing the hon. Member for Bolton South East (Yasmin Qureshi), whose speech on this important subject was wide-ranging and comprehensive. The hon. Member for Cardiff West (Kevin Brennan) brought his significant ministerial experience to bear, and the hon. Member for Bethnal Green and Bow (Rushanara Ali) made a useful contribution. My hon. Friend the Member for East Hampshire (Damian Hinds) brought his significant experience of marketing to the debate.

Time is short, so I will make some points briefly. First, it tends to be a cliché uttered by Ministers and politicians alike that the internet is all-pervasive. It is worth reminding ourselves how quickly it has become all-consuming. The rise of the tablet and the smartphone means that the internet is with us almost 24 hours a day. It brings enormous economic and social value; broadly speaking, the vast majority of people use it responsibly, and it enhances their lives. We also know, however, that it enables individuals to reach a wide audience with bile, bullying and bigotry.

This afternoon, I want to distinguish between what is criminal and unlawful on the internet, and what is objectionable but may not be illegal. It is important to emphasise some of the good things that are happening in the self-regulatory approach to the internet. It is worth remembering that it is not completely the wild west. It is absolutely right that hon. Members come to a debate such as this and highlight where things are going wrong and action is needed. It is also important to note that we have made progress.

The Internet Watch Foundation has been mentioned, and is a model of its kind. It was pioneered in this country, and provides unique data to law-enforcement partners in the UK and abroad to investigate distributors of child pornography, with the result that almost all such images are now hosted abroad rather than in this country. The second phase is that the IWF now works hard to ensure that exposure to such content is blocked by the provision of a dynamic list of child sex abuse web pages. It is important to remember that the IWF has made significant progress.

15:53
Sitting suspended for a Division in the House.
16:03
On resuming—
Mr Vaizey

Before we were interrupted, I was talking about the important work of the IWF, which, as I said, stands as a model for self-regulation around the world for the job it does in blocking access to websites hosting absolutely pernicious material. There is unanimous praise for the work of the IWF.

The other issue that our debate has covered is defamatory material. People often say that the internet is not regulated, but it is; it is regulated by the rule of law, which applies online just as it does offline, and that would apply to defamatory material. We need to ensure that the law works effectively. The Defamation Bill, which the hon. Member for Bishop Auckland (Helen Goodman) mentioned, recently had its Second Reading in the House of Lords, and that is one such area where we are ensuring that the law applies as it should.

The Bill sets out new procedures that will facilitate the resolution of complaints directly by complainants with the author of the allegedly defamatory material, rather than with the website intermediary. We believe that that will encourage website operators to act responsibly without unfairly exposing them to liability in defamation proceedings. It will help freedom of expression by ensuring that material is not taken down without the author being given an opportunity to express their views, and importantly, it will help to enable action to be taken against authors who are responsible for making defamatory statements online. That is one example of how the law applies online, and there are others.

Moving on to what I would characterise as “grossly offensive” material, hon. Members have rightly provided truly awful examples of internet trolling. However, I am not sure that we need to create new offences and put more on to our already crowded statute book, to which the hon. Lady referred. A plethora of existing legislation is being used to prosecute offenders. For example, in September 2011, Sean Duffy was jailed for 18 weeks under the Malicious Communications Act 1988, after posting offensive messages and videos on tribute pages about young people who had died. In 2010, Colm Coss was also imprisoned for posting obscene messages on Facebook tribute sites, including those of Jade Goody and several others. Section 127 of the Communications Act 2003 creates an offence of sending, or causing to be sent,

“by means of a public electronic communications network a message or other matter that is grossly offensive or of an indecent, obscene or menacing character”.

It has been established that abuse posted on social media sites, such as Facebook and Twitter, can be prosecuted under that Act and, as case law develops in that area, we will see swift action when such cases arise.

As the hon. Lady pointed out, we have not only that Act, but the Malicious Communications Act, the Computer Misuse Act 1990, the Protection from Harassment Act 1997, the Criminal Justice and Public Order Act 1994, and the Sexual Offences Act 2003, as well as the common law offence of breach of the peace. Other recent high-profile cases have involved the Olympic diver, Tom Daley, and the footballer, Fabrice Muamba. Quite rightly, the Director of Public Prosecutions is proposing to publish new guidelines in this area, which will be very helpful. We are not in the business of criminalising bad manners, unkind comments, or idiotic views, however offensive we might find them. Cases involving social media require a difficult balancing exercise, and that is what the new guidance from the DPP will address. Those guidelines will be published for consultation at the end of November, and I hope that they will ensure that decision making in such difficult cases is clear and consistent.

The hon. Member for Slough mentioned the “Innocence of Muslims” film, which has attracted worldwide condemnation. President Obama said that the United States Government had nothing to do with that video and called for its message to be rejected. The Secretary of State, Hillary Clinton, also called the film “disgusting and reprehensible”. The right to freedom of opinion and expression is, as I think we would all agree, a vital component of a free, democratic society. However, with that freedom come responsibilities, particularly respect for the beliefs and religious convictions of others.

The right to freedom of opinion and expression is enshrined in our laws. Carefully defined and intensely debated limitations on that right exist under legislation such as the Racial and Religious Hatred Act 2006 and the Public Order Act 1986. Although there are frequent calls to ban websites and online material that carry extremist or offensive content, such content typically falls short of the criminal threshold. Additionally, websites carrying the film may be hosted by internet service providers based outside the UK, and removing a website from one host may not result in its being removed from the internet permanently.

The hon. Lady rightly called for more to be done in the area of self-regulation, but again, to balance the debate, I will say that I would not characterise internet companies as flagrantly flouting their responsibilities. The power of public perception is essential to the success of these businesses. If people did not trust them and believe that they act responsibly, they would move on to new services and sites.

This Government are committed to tackling trolling, cyber-bullying and other forms of abuse and misuse of social networking sites, and we will work—

Albert Owen (in the Chair)

Order. Mr Joyce has withdrawn his debate on UK-listed mining companies. I suspend the sitting until 4.30 pm, when the final debate of the day will start.

16:10
Sitting suspended.