I am grateful to the Lord Chairman for allowing me to collect my thoughts on this amendment while he was going through those other amendments. The purpose of this amendment, which is rather different from that of the previous one, is to create a requirement for an internet service provider that provides a facility for the storage of digital content to consider—no more than that—whether and to what extent that facility might be open to abuse by the storage of indecent images of children. Where the service provider,
“considers that there is a material risk … they must take such reasonable steps as might mitigate, reduce, eliminate or … disrupt”,
such actions.
The context of the amendment is the fact that there are tools available to internet service providers to find out whether such indecent material is contained on their systems. As I am sure noble Lords are aware, images are stored as digital content, a series of zeroes and ones, so even a very complex image, whether pornographic or otherwise, is ultimately just such a series. Most abuse photographs are circulated and recirculated. Many of them are known to the law enforcement authorities, and it is possible for those authorities to search for identical copies, so that they know whether a particular image has appeared before, and in what circumstances.
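By way of illustration, a minimal Python sketch of that exact-match approach; the stored digest and the simple set lookup are assumptions for illustration, not any authority's actual system:

```python
import hashlib

# Hypothetical set of SHA-256 digests of images already known to
# law enforcement; a real reference database would be far larger.
KNOWN_IMAGE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def is_known_image(image_bytes: bytes) -> bool:
    """Return True if this exact file has been seen before.

    A cryptographic hash matches only byte-identical files, which is
    why changing a single pixel is enough to defeat this check.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_IMAGE_HASHES
```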
However, I am told that, increasingly, abusers are making tiny changes to images, sometimes no more than one pixel, so that the images are no longer identical and are not picked up by those methods. Microsoft, I understand, has developed a system called PhotoDNA, which it is making available free to providers. This converts an image into greyscale and breaks the greyscale image down into a grid. Each individual square on the grid is then given what is called a histogram of intensity gradients, essentially a record of how the shading varies across that square. The signature based on those values provides what is called a hash value, which is effectively unique to that particular image; I appreciate that these are technical terms, and until I started looking into this I did not know about them either. This technique allows people to identify images that are essentially the same.
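PhotoDNA itself is proprietary, so the following Python sketch only loosely approximates the steps just described; the grid size, the number of histogram bins and the use of simple gradient magnitudes are assumptions for illustration, not Microsoft's actual algorithm:

```python
import numpy as np
from PIL import Image

GRID = 6    # illustrative grid size; the real parameters are not public
CELL = 16   # pixels per grid square after normalisation
BINS = 4    # histogram bins per grid square

def signature(path: str) -> np.ndarray:
    """A loose PhotoDNA-style signature: greyscale, fixed grid, and a
    small histogram of intensity gradients for each grid square."""
    img = Image.open(path).convert("L").resize((GRID * CELL, GRID * CELL))
    pixels = np.asarray(img, dtype=np.float32)
    # Intensity gradients: how the shading changes between neighbouring pixels.
    gy, gx = np.gradient(pixels)
    magnitude = np.hypot(gx, gy)
    sig = []
    for row in range(GRID):
        for col in range(GRID):
            cell = magnitude[row*CELL:(row+1)*CELL, col*CELL:(col+1)*CELL]
            hist, _ = np.histogram(cell, bins=BINS, range=(0.0, 255.0))
            sig.extend(hist)
    return np.asarray(sig, dtype=np.float32)
```

Because the signature summarises shading patterns rather than exact pixel values, a one-pixel edit leaves it almost unchanged.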
Until now, the way to identify which images are essentially the same has been for some poor police officer or analyst to look at all the images concerned. It is now possible to do that automatically. Because the technology operates in a robust fashion, it can identify which images are appearing and whether they are essentially the same. It is not possible to recreate the image concerned from its PhotoDNA signature; it is only possible to scan systems or databases for signature matches. What is more, because the data for each signature are so small, the technology can scan a large volume of images extremely quickly. Apparently there is a 98% recognition rate.
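The matching step might then look as follows; the distance measure and threshold here are illustrative assumptions, and a real deployment would calibrate them against known match and non-match pairs to reach anything like the quoted recognition rate:

```python
# Illustrative threshold: signatures within this distance of a known
# signature are treated as the same underlying image.
MATCH_THRESHOLD = 50.0

def matches_known(sig: np.ndarray, known_sigs: list[np.ndarray]) -> bool:
    """Near-duplicate test: a small distance means the two images share
    essentially the same shading pattern, so a one-pixel edit no longer
    defeats the check. Nothing here can reconstruct the original image."""
    return any(float(np.linalg.norm(sig - k)) <= MATCH_THRESHOLD
               for k in known_sigs)
```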
I have gone through that in some detail simply to illustrate that there are such techniques available. I believe that Google is working on something—which would, of course, have to be bigger and more complex than what has been produced by Microsoft—which will do the same for videos. It will then be possible to identify similar videos in the same fashion.
The benefit of these techniques is that they make it possible for ISPs to trawl their entire database—to trawl what people are storing online and to identify whether some of the previously known indecent images are in the system. They will then be able to see whether there is a package of images or a pattern, and whether particular users are storing more than others. That then gives them the opportunity to raise that issue with law enforcement officials or take disruptive action, perhaps by withdrawing service from that user.
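To make that trawling step concrete, a toy sketch building on the functions above; the per-account layout of stored files and the flagging threshold are assumptions for illustration only:

```python
from collections import Counter

FLAG_THRESHOLD = 1  # illustrative: even one match might warrant a referral

def flag_accounts(stored_files: dict[str, list[str]],
                  known_sigs: list[np.ndarray]) -> Counter:
    """Count signature matches per account, so that accounts holding
    known material can be referred to law enforcement or have their
    service withdrawn. Only signatures are computed; no image is viewed."""
    counts = Counter()
    for user, paths in stored_files.items():
        for path in paths:
            if matches_known(signature(path), known_sigs):
                counts[user] += 1
    return Counter({u: n for u, n in counts.items() if n >= FLAG_THRESHOLD})
```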
The benefits of the specific technology are that humans do not have to scan the individual images. A number of noble Lords have seen the suites used by CEOP or New Scotland Yard, where a row of police officers sit viewing indecent images of children, which is distressing for those officers and possibly harmful to them in the long term. That does not need to happen in this case. The service providers do not have to store the images that they are matching against in order to carry out this exercise, because all they are storing are the PhotoDNA hash values of the images concerned; they are therefore not exposing themselves to potential charges in that respect. The technology makes this comparatively easy and simple to do and does not involve a great deal of data. It also means that the service providers are not interfering in any way with the privacy of their users, other than to check, in this anonymised way in which they do not view the images, that none of the images stored there is known child pornography.
The purpose of this amendment is to place an obligation on service providers to make use of these technologies as they are developed. Some providers already do this, and others are willing to; I think that Facebook has quite a good record in this respect. However, the amendment would place an obligation on all of them to consider whether they should use these techniques. As I say, in this instance Microsoft is making the technology and the system available free to providers.
Before the noble Baroness, Lady Hamwee, goes through whatever drafting faults the amendment may contain, I should point out why I think it is important. In our discussions just three months ago on the DRIPA legislation, it was suggested that one reason why the relevant changes were being made was to give service providers cover against legal challenge in other countries, where people might ask why they were allowing law enforcement officials to do these things. The amendment would provide some legal cover for those service providers—in exactly the same way as the DRIPA legislation does—against challenges that this measure somehow infringes the freedom of speech of people who want to store pornographic images of children. The purpose of this amendment is to require service providers to consider whether or not they might be at risk of this misuse and then to take appropriate reasonable steps, using the best available techniques, to,
“mitigate, reduce, eliminate or … disrupt”,
it. I beg to move.
My Lords, I rise briefly to speak in support of Amendment 47 in the name of the noble Lord, Lord Harris. Some may take the view that internet service providers cannot be held responsible for information that people use them to hold. In my view, although ISPs certainly do not have responsibility for generating content, they play a very important role in facilitating it: first, in the sense that storage protects the material in question and thereby helps to guarantee its continued existence; and, secondly, in the sense of providing a basis from which the said material may be transmitted. In so doing, they have a responsibility actively to take all reasonable steps to ensure, on an ongoing basis, that they are not facilitating the storage and/or transmission of material of the kind set out in subsection (1) of the clause proposed in the amendment.
For myself, I would also like ISPs to have to demonstrate that these active steps have indeed been taken, and are being taken, on an ongoing basis. We must foster a legislative framework that exhibits zero tolerance of all aspects of child sex abuse images, including ISPs facilitating the storage and/or transmission of such images. I very much look forward to listening to what the Minister has to say in his response to this important amendment.