Internet-based Media Companies Debate


Tom Harris Excerpts
Wednesday 31st October 2012

Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

This information is provided by Parallel Parliament and does not comprise part of the official record

Fiona Mactaggart

The case that my hon. Friend cites is an example of exactly why I called for this debate. In that case, Facebook was not taking proper responsibility. It did not have a transparent complaints process that my hon. Friend’s constituent was able to use. It did not have a mechanism for remedying the harm that she had experienced and, frankly, the police are not up to date enough with the online world. That is not true of the whole of the police service—for example, when it comes to child abuse images, the police have quite well-developed policing strategies—but in the case of online bullying, I think they are behind the game.

The fundamental responsibility, in that case, belongs to Facebook, but the police must take more seriously the fact that things happen in the virtual world that they would not tolerate in the real world, and they must ensure that their policies and procedures function appropriately in both. We have not grown up, as it were, and ensured that we have modernised our systems, including those of the police. My big argument is with companies such as Facebook. If they were to take their responsibilities more seriously, my hon. Friend’s constituent would have been much safer, and the problem would perhaps not have got as far as requiring police action.

Some new media companies seem persistently to fail to establish clear values and procedures for handling matters, such as the one that my hon. Friend raised, that can profoundly affect individuals and wider society. In the early days of the internet, that was perhaps understandable to a degree. They were learning; we were all learning. We are, however, no longer in the early days, and now such failure looks more like negligence or lack of concern. Too often, companies seem to struggle to recover a position, by which time a great deal of damage might have been done. I want to establish a new norm, whereby we expect companies, from very early on in their lives, to have an enforceable social responsibility code, which contains a publicly declared process for dealing with objectionable or illegal content.

Mr Tom Harris (Glasgow South) (Lab)

Does my hon. Friend not accept that putting “objectionable” in with “illegal” poses a danger to freedom of expression? The two terms mean completely different things. As a party that has generally supported freedom of speech, surely we should protect the right of someone to be offended if they so wish, or to say something offensive, as long as it is not illegal. We should be careful about merging the two definitions.

Fiona Mactaggart

My view is that because the internet so substantially broadens the audience for material, those who are responsible for doing that must take some responsibility for the content, in a way that they are not currently prepared to do. They obviously need to do that when the content is illegal, but I will go on to argue that they should also do it when it is objectionable. They should not necessarily delete everything in the first instance, but they must have a process by which someone who wants to object can properly make a case and argue for something to be taken down. The process should be transparent and contain a right of appeal, so that the matter can be dealt with.

Our publishers in the real world take responsibility for what they publish, choosing not to publish material that they deem profoundly offensive, and YouTube is effectively a publisher. It is dodging its responsibility as an institution that broadens the audience so significantly for the material that it carries. It is pretending not to be a publisher, and that is a bit of a fraud. I will go on to deal further with the issue that my hon. Friend the Member for Glasgow South (Mr Harris) raised.

A policy should guide companies when they decide whether to take down material, and there should be a right of appeal where appropriate. I would want companies to work with groups such as the Internet Watch Foundation and the UK Council for Child Internet Safety to ensure the promotion of public safety.

I initially intended to raise this issue because of the evidence that paedophiles have been using Twitter to groom young children; Members might have seen reports on that in The Sunday Mirror. I praise the newspaper for its campaign, because it has forced Twitter to take action to protect children. However, Twitter has still not joined the Internet Watch Foundation to show its support for the wider industry’s measures to keep child abuse images off the internet as a whole. That is a shameful example of a profound disregard for the interests of British children and young people. What is worse is that when the storm broke, Twitter simply retreated into a Californian bunker. It seems to me that it cynically decided to sit out the storm, in the hope that it would blow over and people would forget about it. Well, here is the bad news: it did not.

Habbo Hotel took a similar line when Channel 4 exposed how its site was being grossly misused and was putting children in danger. This case was, in a sense, much worse, because Habbo had at least signed up to various voluntary codes of practice. The only problem was that it was not honouring them, which speaks volumes about the weakness of our so-called self-regulatory regime for the internet in the UK. Even BlackBerry, a company in my constituency that is ethical in many important ways, was found wanting when it emerged that child pornography was not being blocked for users of its handsets on any network except T-Mobile, and the same was true for adult content. Given how popular BlackBerry handsets are with kids, that was truly appalling, but I am happy to say that both matters have now been put right.

Failure to act can lead to tragedy. It is only two weeks since Tallulah Wilson killed herself after visiting suicide websites. At the time, a spokesman for the Samaritans put the need for more responsible behaviour well:

“It is important that organisations which run sites that are highly popular with young people develop responsible practices around suicide-related content, including promoting sources of support and by removing content which actively encourages or glorifies self-harm or suicide”.

Glorifying self-harm or suicide is not illegal, but it is profoundly dangerous. The new Health Minister, the hon. Member for North Norfolk (Norman Lamb), last month warned that telecommunications companies faced being regulated by the Government if they failed to block websites offering advice on suicide. It is time for the companies to act.

Then there was the unrest caused by the publication on YouTube of the provocative American-made video insulting Mohammed. It caused deaths and injuries around the world when so many people saw or heard of it.

Mr Tom Harris

I feared that the debate was heading in that direction. Can we just be absolutely clear that the deaths and injuries throughout the world were not caused by the YouTube video, obnoxious and appalling though it was? They were caused by fanatics who chose to resort to violence against innocent people. No one forced them to do that.

Fiona Mactaggart

My hon. Friend is right, but what happened was completely predictable. Responsible publishers choose not to publish things that are designed to provoke. I have not seen the video, but I persuaded someone in my office to, and the clear intention of the material is absolutely to provoke. It was irresponsible for YouTube to carry the video.

In its response, Google, rather like my hon. Friend, uttered pious words about free speech and the first amendment, but I would like to make some observations about that. Google is an exceptionally profitable business. It is not a charity, or an agency that can lay claim to moral or political leadership in any credible way. I say that not just because of the mounting number of times Google is being hauled, in relation to other parts of the internet, before the courts and regulators and losing. The company seems to be highly selective about the parts of the law that it wishes to observe.

Many Muslims in the UK and throughout the world—some of whom reacted in the way my hon. Friend described, and some of whom simply demonstrated peacefully outside Google’s UK headquarters—were deeply offended by the video and by YouTube’s failure to remove it, except in the two countries where the company acknowledged that there might be violent protests. I understand that YouTube has now also disabled links to the clip in at least two other countries, including India. It became clear, therefore, as the tragedy of the video unfolded, that the company did not have an absolute fixed position that it would defend to the nth degree. It was a movable feast, but it moved too slowly, and only after too many people had died, been injured or had their property destroyed. That highlights the inadequacy, or at any rate the inconsistency, of YouTube’s processes. I have looked at those processes so that I can try to advise people who have been hurt by the video, and the processes are almost deliberately opaque and make it hard for people to find any mechanism to address their hurt.

I shall not address the issues that the hon. Member for Devizes (Claire Perry) has led on in Parliament, because she wants to speak later, and I want other Members to have a chance to contribute to this debate, but I am concerned that decisions—the Muslim video is one example—appear to be taken on an ad hoc basis. A codified, publicly available system would help to show that Google—this applies to other companies, too—is serious about its responsibilities. The companies need to grow up. They are not young cowboys battling on the wilder edges of a new territory about which we know little; we now know a lot, and it is time that that was reflected in the behaviour of internet businesses.

--- Later in debate ---
Claire Perry (Devizes) (Con)

I compliment the hon. Member for Slough (Fiona Mactaggart) on securing this extremely valuable debate. I know she has campaigned tirelessly on the issue and will continue to do so.

I would like to narrow the focus of the debate specifically to the internet service providers. In the UK the top six companies control and sell about 95% of access into the home, which is the place where most of the children to whom the hon. Lady refers are accessing such troubling images. Those companies generate some £3.5 billion a year through access fees, and they are, by and large, well known household names, typically with a well developed sense of corporate social responsibility.

Historically, we have had an ideological situation in which the internet has been treated differently from any other form of media. As the hon. Lady says, back when the internet was a few pony-tailed developers and was a specialist thing that we had dialling up slowly in the corner of our sitting room, that was just fine. Indeed, the light-touch regulation, or lack of regulation, and the global nature of the internet is what has made it such an extremely valuable and innovative forum. Of course, that has changed. The internet is now, arguably, one of the most mass-market forms of communication. With technological convergence, particularly with the rise of internet-enabled televisions, 3G and 4G networks and view-on-demand systems, the internet is rapidly overtaking all other forms of media as the place where many people, particularly the young, socialise and access information and news.

As the hon. Lady alludes to, although children use the internet for all those incredibly productive and wonderful things, with their extraordinary curiosity they also seek out and stumble across material that is very troubling to many. We asked a group of adults whether they are concerned about the ease of access, particularly to adult material, on the internet, and 82% said that they are extremely concerned about how easy it is to access not just pornography but websites on self-harm, suicide and bullying, which are the things we would all like to protect our children against but struggle to do so.

Why do we struggle to do so? I, of course, am a great believer in personal and family responsibility. It is my job as a mother to keep my children safe in the online and offline worlds, but I submit that the technology we have been using to do that is almost obsolete. We have been asked since the earliest days of the internet to download device-protection filters ourselves. People who live in a household like mine will have multiple internet-enabled devices, which we are supposed to protect individually. The download process can be slow, and I submit that in many households the teenage child is the web tsar and computer guru, not the parent. If the parent says, “Have we downloaded the safe search and protection filters?” big Johnny or Janie will say, “Of course, mum and dad, don’t you worry. Off you go. Don’t trouble your little heads about it.”

As a result, the proportion of parents who say they have downloaded internet controls or filtering software in households with a child aged between five and 15—remember that 95% of children live in internet-enabled households—has fallen 10 percentage points over the past three years to 39%. That means that six out of 10 children potentially live in households where there is no filtering of content. Troublingly, that proportion drops even further to 33% for teenage children, so two thirds of children aged 13 to 15 live in unprotected households. We can debate for ever the rights and wrongs of that, and how it is all the responsibility of parents, but we know that 82% of parents care about this, so it is not a non-issue. The technology and the compact of responsibility have broken down.

What to do? This debate has been started many times. Indeed, the previous Government worked very hard and commissioned a number of reports, including the Byron review. They took the issue very seriously. We have moved on, but little has been done.

We tend to debate ideology. Free speech comes up frequently, and, of course, when defining pornography, one woman’s pornography is another man’s enjoyable Sunday afternoon.

Mr Tom Harris

Don’t look at me!

Claire Perry

Sorry, I was not looking at the hon. Gentleman with an accusatory glance.

My point is that the debate has often been sterile, ending up with discussions of censorship. I would never like to see that, because I do not believe in censoring material; I believe in responsibility and companies signing up to an agenda.

The hon. Member for Slough and I, as many Members did on a cross-party basis, suggested a parliamentary inquiry. We took a lot of evidence and came up with the idea that an opt-in system is the best way to deliver protection. Each home would have a clean feed, using the same filtering technology as is used in device-level filters and in schools—the technology is simple and cheap—and people opt in to receive adult content. There would be choice, there would be no censorship and the material would still be available. That proposal was very popular, and almost two thirds of adults say they like the idea of opt-in technology.

I am proud to be part of a Government who have continued to take the issue seriously. The Prime Minister commissioned the Bailey review, which examined child sexualisation and child safety and resulted in the first little step forward in the internet safety debate: active choice, in which people are forced to say whether they want filters installed. To return to the big Johnny or Janie problem, how many households truly involve the adults in making that decision?

Mr Harris

An aspect of that has been raised with me. One potential problem with the opt-in system—the hon. Lady will probably be able to answer this—is that there are numerous teenagers who cannot rely on being able to speak to their parents about sensitive sexual health issues. With an opt-in filter when signing up to a new internet service provider, I am told that there would be a danger of blocking sites that give reproductive health advice. Many children cannot ask their parents about such issues—I expect about 99% cannot, now that I think of it. That could be a dangerous consequence. Has she considered that particular aspect?

Claire Perry

I thank the hon. Gentleman for that thoughtful intervention. Those are some of the questions that get raised: blocking sites that help children with their homework, or that concern sexual health, sexuality and other things that we know children are more comfortable talking about to friends and others on the internet than to their family.

We asked the Family Planning Association, a laudable organisation that publishes a lot of material about sexual health and guidance, and it was supportive. The FPA says that the problem right now is that children are accessing porn as a way of receiving sex education. That is not good sex education. It teaches children nothing about relationships. The FPA felt that using an age verification system—

--- Later in debate ---
Yasmin Qureshi

I agree with my hon. Friend. Such an example would be the famous case of Max Mosley. Even though what was written in newspapers was found to be defamatory, it continues to be published on the internet.

I was a member of the Joint Committee on Privacy and Injunctions. The managing directors of Google, Facebook and Twitter gave evidence, and the Committee explored the issue of why content that a nation state has clearly declared illegal is not removed. There were not many issues on which the members of the Committee were unanimous, but we all agreed that all three companies were just twisting and turning and not giving us direct answers. They had to be pressed hard. Initially, they said that it was technically not possible, or difficult, or expensive, or impossible to monitor. When the Committee asked more detailed questions, such as, “Do you have the technology? Is there no software available?” basically, it boiled down to the fact that they did not want to do it—it was as simple as that. It was not in their financial interests to do it. It was not in their profit-making interests to do it. It was not that they could not do it because it was so difficult; they just did not want to. We got that answer—not even then was there complete acceptance—after God knows how many questions. Eventually, there was an admission that, technically, there was no reason why they could not do it. We at least got to the bottom of that.

The Committee looked at the whole issue of regulating the internet. Everybody accepts that there are challenges—they may be technical challenges, but they certainly can be overcome if the desire and intention is there. The issue is all about saying, “We know you can do these things. Why don’t you self-regulate?” If there is content on the internet, whether via YouTube, Facebook or Twitter, that is offensive, rude or defamatory, people should not have to go through the long process of dealing with the law. Max Mosley is a rich man and is able to do so. I think he has challenged Google many times. Every time he makes a challenge, content is deleted before it eventually reappears. Most ordinary people cannot do that—they do not have the money, time or resources. There should be an internal mechanism to deal with such cases. When there is freedom of expression and people can say what they like, it is important for there to be responsibility.

I will return to the recent YouTube case. I accept that YouTube did not cause the deaths, but it is right to say that it knew it would happen. It was done deliberately to provoke, annoy, vilify and abuse. It was not done to discuss and disseminate issues and ideas. It was not done as an academic discussion about a particular aspect of a particular religion, or any particular character in any religious history. It was done purely as a form of abuse. At that point, we have to think about the level of abuse that is aimed at people, whether they are dead or alive.

Mr Tom Harris

My hon. Friend provokes me into one more intervention. She said earlier that where something on the internet is offensive, rude or defamatory there should be processes to resolve that. Offensive and rude are not remotely, and never will be, illegal. Defamatory is illegal. I ask her once again to draw that distinction. Something being offensive does not necessarily mean that anyone has to withdraw it. There were many people in our party, before the age of the internet, who were actually apologists for those who wanted to ban Salman Rushdie’s “The Satanic Verses.” That was unacceptable then and it would be unacceptable now. We have to be very careful that we do not throw the baby out with the bathwater.

Yasmin Qureshi

I am not an apologist for the Salman Rushdie issue. That was a book that was trying to discuss ideas. As my hon. Friend says, the internal rules of this country can decide whether something is illegal or defamatory. It is one thing to have a discussion about particular issues or concepts, but it is another to take that to an extreme. For example, there is an old film called “The Life of Brian”, and other films have been made about Jesus Christ. Within the Churches, there may be a number of issues—for example, homosexuality—that people would like to discuss. I do not think that anybody says that those ideas should not be discussed.

However, I have sympathy for the billions of Christians across the world. We can debate issues, but that is not the same as showing someone they revere so much in an intimate situation, when one of the aspects of the religion, or of the person’s life, was the fact that he was a gentleman who refrained from intimate relationships. Talking about it is one thing, but to depict it and show it: is that freedom of expression or a deliberate attempt to generate publicity and create loads of money? Obviously, the minute a film becomes controversial it often becomes a bestseller; but at the same time billions of people have been badly offended. Perhaps we should think about the concept of complete freedom of expression—although it has never been complete. We should think about people’s sensitivities. That does not mean talking about censorship, or saying that people cannot discuss ideas, or that there cannot be freedom of expression or discussion; but we should think about it.

More importantly, as most hon. Members who have spoken in the debate have said, there is no system to deal with the issues. If there is something on the internet that is defamatory, wrong, objectionable or offensive, people should be able to contact the companies concerned and express their views. Then the companies would at least have the chance to consider things and say, “Maybe we should take this away, and we should not have this photo online.” There is no such mechanism at the moment. It is difficult. As for YouTube, it was asked to remove material in the US, and it did. Internet companies are selective about what they choose to take off and put on, and mostly the motive, I am sorry to say, is profit. That is the ultimate goal for all of them. They are not talking about freedom of expression. Perhaps mine is a personal and old-fashioned view, but I do not think insulting and abusing people is freedom of expression. It is just downright abuse and bad manners. However, I digress.

I want to end by saying that we should have a system that is simple to follow for people who are unhappy with what is on the internet, and that the response of the internet companies should be swift as well. When something happens it should not go on for months, with the item being taken off perhaps a year down the road. By then the damage has been done. It is important to have a system that is swift, simple and cheap.