Kirsty Blackman (Scottish National Party, Aberdeen North)
Public Bill Committees
I beg to move, That the clause be read a Second time.
This is another attempt to place a higher bar and more requirements on regulated services that are likely to cause the most serious risks of harm. The Minister has consistently said that he is keen to consider regulating the companies and platforms that have the highest potential risk of harm more strictly than the normal regime would allow. Some of the platforms would not be category 1 on the basis that they have a small number of members, but the potential for harm—radicalisation, extremism, severe damage to people or extreme pornography—is very high.
I am not yet happy that the Minister has provided an adequate answer to the question about the regulation of the highest-risk platforms that do not meet the category 1 thresholds. If he is unwilling to accept this amendment or any of the other amendments tabled by the Opposition on this specific issue, I hope that he will give consideration to a Government amendment on Report or when the Bill goes through the House of Lords in order that this loose end can be tied up.
As I have said before—I do not want to go too much over comments that I have made previously—it is reasonable for us to have a higher bar and a stricter regulation regime for specific platforms that Ofcom will easily be able to identify and that create the highest harm. Again, as I have said, this is another way of going about it. The new clause suggests that if Ofcom assesses that a service poses a very high risk of harm, it might, notwithstanding the categorisation of that service, require it to perform the children’s risk assessment duties and the safety duties protecting children. This is specifically about the children’s risk assessment.
I have previously raised concerns about services not being able to accurately assess the number of child users they have. I am still not entirely comfortable that platforms will be able to make that assessment accurately, and a service might therefore escape the child user requirements either because it has underplayed or understated the number of children using it, or because only a few hundred children use the service—which is surely still massively concerning for the wellbeing of those few hundred children.
I hope the Minister can give us some comfort that he is not just considering what action to take, but that he will take some sort of action on Report or when the Bill proceeds through the House of Lords.
It is a pleasure to serve with you in the Chair again, Ms Rees. I rise to speak in support of new clause 27.
We have argued that the Government’s approach to categorising services fails to take account of the harms that could result from smaller services. I understand that a risk-based approach rather than a size-based approach is being considered, and that is welcome. The new clause would go some way to improving the categorisation of services as it stands. It is critical that there are ways for Ofcom to assess companies’ risk of harm to users and to place additional duties on them even when they lie outside the category to which they were initially assigned. Ofcom should be able to consult any organisation that it sees fit to consult, including user advocacy groups and civil society, in assessing whether a service poses
“a very high risk of harm”.
Following that, Ofcom should have powers to deliver the strictest duties on companies that expose adults to the most dangerous harms. That should always be proportionate to the risk of harm.
Labour supports the new clause and the arguments made by the hon. Member for Aberdeen North.
I thank the hon. Member for Aberdeen North for raising those considerations, because protecting children is clearly one of the most important things that the Bill will do. The first point that it is worth drawing to the Committee’s attention again is the fact that all companies, regardless of the number of child users they may have, including zero child users, have duties to address illegal content where it affects children. That includes child sexual exploitation and abuse content, and illegal suicide content. Those protections for the things that would concern us the most—those illegal things—apply to companies regardless of their size. It is important to keep that in mind as we consider those questions.
It is also worth keeping in mind that we have designed the provisions in clause 31 to be a bit flexible. The child user condition, which is in clause 31(3) on page 31 of the Bill, sets out that one of two tests must be met for the child user condition to be met. The condition is met if
“there is a significant number of children who are users of the service…or…the service…is of a kind likely to attract a significant number of users who are children.”
When we debated the issue previously, we clarified that the word “user” did not mean that they had to be a registered user; they could be somebody who just stumbles across it by accident or who goes to it intentionally, but without actually registering. We have built in a certain amount of flexibility through the word “likely”. That helps a little bit. We expect that where a service poses a very high risk of harm to children, it is likely to meet the test, as children could be attracted to it—it might meet the “likely to attract” test.
New clause 27 would introduce the possibility that even when there were no children on the service and no children were ever likely to use it, the duties would be engaged—these duties are obviously in relation to content that is not illegal; the illegal stuff is covered already elsewhere. There is a question about proportionality that we should bear in mind as we think about this. I will be resisting the new clause on that basis.
However, as the hon. Member for Aberdeen North said, I have hinted or more than hinted to the Committee previously that we have heard the point that has been made—it was made in the context of adults, but applies equally to children here—that there is a category of sites that might have small numbers of users but none the less pose a high risk of harm, not harm that is illegal, because the “illegal” provision applies to everybody already, but harm that falls below the threshold of illegality. On that area, we heard hon. Members’ comments on Second Reading. We have heard what members of the Committee have had to say on that topic as well. I hope that if I say that that is something that we are reflecting on very carefully, the hon. Member for Aberdeen North will understand that those comments have been loudly heard by the Government. I hope that I have explained why I do not think new clause 27 quite works, but the point is understood.
I appreciate the Minister’s comments, but in the drafting of the new clause, we have said that Ofcom “may” impose these duties. I would trust the regulator enough not to impose the child safety duties on a site that literally has no children on it and that children have no ability to access. I would give the regulator greater credit than the Minister did, perhaps accidentally, in his comments. If it were up to Ofcom to make that decision and it had the power to do so where it deemed that appropriate, it would be most appropriate for the regulator to have the duty to make the decision.
I wish to press the new clause to a Division.
Question put, That the clause be read a Second time.
I beg to move, That the clause be read a Second time.
The new clause attempts to address an asymmetry in the Bill in relation to the lack of user empowerment features for child users. As far as I am aware, there is no requirement for user empowerment functions for child users in the Bill. The new clause would require that if a service has to have user empowerment features in place for adults, then
“OFCOM may require a service to provide equivalent features designed specifically for child users.”
Ofcom would be able then to provide guidance on how those user empowerment features for child users would work.
This provision is especially important for the fairly small number of platforms and providers that are very much aimed at children, and where the vast majority of users are children. We are not talking about Facebook, for example, although if Facebook did have child user empowerment, it would be a good thing. I am thinking about organisations and games such as Roblox, which is about 70% children; Fortnite, although it has quite a lot of adult users too; and Minecraft, which has significant numbers of child users. On those platforms that are aimed at children, not having a child-centred, child-focused user empowerment requirement is an oversight. It is missing from the Bill.
It is important that adults have the ability to make privacy choices about how they use sites and to make choices about some of the content that they can see on a site by navigating the user empowerment functions that exist. But it is also important for children to have that choice. I do not see why adults should be afforded that level of choice and flexibility over the way that they use platforms and the providers that they engage with, but children should not. We are not just talking here about kids who are eight: we are talking about children far older, and for whom adult-centred, adult-written user empowerment functions may not be the best option or as easy to access as ones that are specifically focused on and designed for children.
I have had a discussion with the National Society for the Prevention of Cruelty to Children about the user empowerment functions for child users. We have previously discussed the fact that complaints features have to be understandable by the users of services, so if the Minister is unwilling to accept the new clause, will he give some consideration to what happens when the provider of the platform is marketing that platform to children?
The Roblox website is entirely marketed as a platform for children. It is focused in that way, so will the Minister consider whether Ofcom should be able to require differential user empowerment functions, particularly in cases where the overwhelming majority of users are children? Also, it would not be beyond the wit of man for platforms such as Facebook to have two differential user empowerment functions based on whether somebody is under the age of 18—whether they are a child or an adult—because users tell Facebook their date of birth when signing up. We have talked a lot about age verification and the ways in which that could work.
I would appreciate it if the Minister would consider this important matter. It is something that is lacking at the moment, and we are doing our children a disservice by not providing them with the same functionality that we are providing, or requiring, for adult users.
Labour argued in favour of greater empowerment provisions for children during the debate on new clause 3, which would have brought in a user advocacy body for children. YoungMinds has pointed out that many young people are unaware of the Bill, and there has been little engagement with children regarding its design. I am sure members of the Committee would agree that the complexity of the Bill is evidence enough of that.
New clause 28 would make the online world more accessible for children and increase their control over the content they see. We know that many children use category 1 services, so they should be entitled to the same control over harmful content as adults. As such, Labour supports the new clause.
It does make sense, and I do understand what the Minister is talking about in relation to clause 10 and the subsections that he mentioned. However, that only sets out what the platforms must take into account in their child risk assessments.
If we are talking about 15-year-olds, they are empowered in their lives to make many decisions on their own behalf, as well as decisions guided by parents or parental decisions taken for them. We are again doing our children a disservice by failing to allow young people the ability to opt out—the ability to choose not to receive certain content. Having a requirement to include whether or not these functionalities exist in a risk assessment is very different from giving children and young people the option to choose, and to decide what they do—and especially do not—want to see on whichever platform they are interacting on.
I have previously mentioned the fact that if a young person is on Roblox, or some of those other platforms, it is difficult for them to interact only with people who are on their friends list. It is difficult for that young person to exclude adult users from contacting them. A lot of young people want to exclude content, comments or voice messages from people they do not know. They want to go on the internet and have fun and enjoy themselves without the risk of being sent an inappropriate message or photo and having to deal with those things. If they could choose those empowerment functions, that just eliminates the risk and they can make that choice.
Could I develop the point I was making earlier on how the Bill currently protects children? Clause 11, which is on page 10, is on safety duties for children—what the companies have to do to protect children. One thing that they may be required by Ofcom to do, as mentioned in subsection (4)(f), is create
“functionalities allowing for control over content that is encountered, especially by children”.
Therefore, there is a facility to require the platforms to create the kind of functionalities that, as that subsection is drafted, relate not just to identity but to the kind of content being displayed. Does that go some way towards addressing the hon. Lady’s concern?
That is very helpful. I am glad that the Minister is making clear that he thinks that Ofcom will not just be ignoring this issue because the Bill is written to allow user empowerment functions only for adults.
I hope the fact that the Minister kindly raised clause 11(4) will mean that people can see its importance, and that Ofcom will understand that it should give consideration to it, because that list of things could have just been lost in the morass of the many, many lists of things in the Bill. I am hoping that the Minister’s comments will go some way on that. Notwithstanding that, I will press the new clause to a vote.
Question put, That the clause be read a Second time.
I beg to move, That the clause be read a Second time.
I mentioned this in earlier consideration. The issue was raised with me by Mencap, specifically in relation to the people it represents who have learning disabilities and who have a right to access the internet just as we all do. They should be empowered to use the internet with a level of safety and be able to access complaints, to make content reports and to use the user empowerment functions. Everybody who is likely to use the platforms should be able to access and understand those functions.
Will the Minister make it clear that he expects Ofcom, when drafting guidance about the user empowerment functions and their accessibility, the content reporting and the complaints procedures, to consult people about how those things work? Will he make it clear that he hopes Ofcom will take into account the level of accessibility? This is not just about writing things in plain English—or whatever that campaign is about writing things in a way that people can understand—it is about actually speaking to groups that represent people with learning disabilities to ensure that content reporting, the empowerment functions and the complaints procedures are accessible, easy to find and easy to understand, so that people can make the complaints that they need to make and can access the internet on an equal and equitable basis.
I rise to speak in support of the new clause. Too often people with learning disabilities are left out of discussions about provisions relevant to them. People with learning disabilities are disproportionately affected by online harms and can receive awful abuse online.
At the same time, Mencap has argued that social media platforms enable people with learning disabilities to develop positive friendships and relationships. It is therefore even more important that people with learning disabilities do not lose out on the features described in clause 14, which allow them to control the content to which they are exposed. It is welcome that clauses 17, 18, 27 and 28 specify that reporting and complaints procedures must be easy to access and use.
The Bill, however, should go further to ensure that the duties on complaints and reporting explicitly cater to adults with learning disabilities. In the case of clause 14 on user empowerment functions, it must be made much clearer that those functions are easy to access and use. The new clause would be an important step towards ensuring that the Bill benefits everyone who experiences harms online, including people with learning disabilities. Labour supports the new clause.
That is fine, but I have a further point to make. The new clause would be very important to all those people who support people with learning disabilities. So many of the services that people use do not take account of people’s learning disabilities. I have done a huge amount of work to try to support people with learning disabilities over the years. This is a very important issue to me.
There are all kinds of good examples, such as easy-read versions of documents, but the Minister said when batting back this important new clause that the expression “all adult users” includes people with learning disabilities. That is not the case. He may not have worked with a lot of people with learning disabilities, but they are excluded from an awful lot. That is why I support making that clear in the Bill.
We on the Opposition Benches say repeatedly that some things are not included by an all-encompassing grouping. That is certainly the case here. Some things need to be said for themselves, such as violence against women and girls. That is why this is an excellent new clause that we support.
I thank the Minister, particularly for providing the clarification that I asked for about who is likely to be consulted or taken into account when Ofcom is writing the codes of practice. Notwithstanding that, and particularly given the rather excellent speech from the shadow Minister, the hon. Member for Worsley and Eccles South, I am keen to press the new clause to a vote.
Question put, That the clause be read a Second time.
I beg to move, That the clause be read a Second time.
I drafted this new clause following a number of conversations and debates that we had in Committee about how the Act will be scrutinised. How will we see whether the Act is properly achieving what it is supposed to achieve? We know that there is currently a requirement in the Bill for a review to take place but, as has been mentioned already, that is a one-off thing; it is not a rolling update on the efficacy of the Act and whether it is achieving the duties that it is supposed to achieve.
This is particularly important because there are abilities for the Secretary of State to make changes to some of the Act. Presumably the Government would not have put that in if they did not think there was a possibility or a likelihood that changes would have to be made to the Act at some future point. The Bill is certainly not perfect, but even from the Government’s point of view it is not perfect for all time. There is a requirement for the Act to be updated; it will have to change. New priority harms may have to be added. New details about different illegal acts may have to be added to the duties. That flexibility is given, and the Secretary of State has that flexibility in a number of cases.
If the Act were just going to be a standing thing, if it were not going to be updated, it would never be future-proof; it would never work in the changing world that we have. We know that this legislation has taken a very long time to get here. We have been sadly lacking in significant regulation in the online world for more than 20 years, certainly. For a very long time we have not had this. Now that the Act is here—or it will be once the Bill passes through both Houses of Parliament—we want it to work.
That is the point of every amendment we have tabled: we are trying to make the Bill better so that it works and can keep people as safe as possible. At the moment, we do not know how safe the internet will be as a result of the Bill. Even once it begins to be implemented, we will not have enough information on the improvements it has created to be able to say, “Actually, this was a world-leading piece of legislation.”
It may be that the digital regulation committee that I am suggesting in this new clause regularly looks at the implementation of the Bill going forward and says, “Yep, that’s brilliant.” The committee might look at the implementation and the increasing time we spend online, with all the harms that can come with that, and say, “Actually, you need to tweak that a bit” or, “That is not quite fulfilling what it was intended to.” The committee might also say, “This brand new technology has come in and it is not entirely covered by the Act as it is being implemented.” A digital regulation committee was proposed by the Joint Committee, I think, to scrutinise implementation of the legislation.
The Government will say that they will review—they always do. I have been in so many Delegated Legislation Committees that involve the Treasury and the Government saying, “Yes, we keep everything under review—we always review everything.” That line is used in so many of these Committees, but it is just not true. In January I asked the Department for Digital, Culture, Media and Sport
“how many and what proportion of (a) primary and (b) secondary legislation sponsored by (i) their Department…has undergone a post legislative review”.
It was a written question I put to a number of Departments including DCMS. The reply I got from the Minister here was:
“The number of post legislative reviews the Department has undertaken on primary and secondary legislation in each of the last five years is not held within the Department.”
The Government do not even know how many pieces of primary or secondary legislation they have reviewed. They cannot tell us that all of them have been reviewed. Presumably, if they could tell us that all of them have been reviewed, the answer to my written question would have been, “All of them.” I have a list of the number they sponsored. It was six in 2021, for example. If the Department had reviewed the implementation of all those pieces of legislation, I would expect it to be shouting that from the rooftops in response to a written question. It should be saying, “Yes, we are wonderful. We have reviewed all these and found that most of them are working exactly as we intended them to.”
I do not have faith in the Government or in DCMS—nor pretty much in any Government Department. I do not have faith in their ability or intention to adequately and effectively review the implementation of this legislation, to ensure that the review is done timeously and sent to the Digital, Culture, Media and Sport Committee, or to ensure those proper processes that are supposed to be in place are actually in place and that the Bill is working.
It is unfortunate for the Minister that he sent me that reply earlier in the year, but I only asked the question because I was aware of the significant lack of work the Government are doing on reviewing whether or not legislation has achieved its desired effect, including whether it has cost the amount of money they said it would, whether it has kept the number of people safe that they said it would, and whether it has done what it needs to do.
I have a lack of faith in the Government generally, but specifically on this issue because of the shifting nature of the internet. This is not to take away from the DCMS Committee, but I have sat on a number of Select Committees and know that they are very busy—they have a huge amount of things to scrutinise. This would not stop them scrutinising this Act and taking action to look at whether it is working. It would give an additional line of scrutiny, transparency and defence, in order to ensure that this world-leading legislation is actually world-leading and keeps people safe in the way it is intended to.
It is an honour to support the new clause moved by the hon. Member for Aberdeen North. This was a recommendation from the Joint Committee report, and we believe it is important, given the sheer complexity of the Bill. The Minister will not be alarmed to hear that I am all in favour of increasing the scrutiny and transparency of this legislation.
Having proudly served on the DCMS Committee, I know it does some excellent work on a very broad range of policy areas, as has been highlighted. It is important to acknowledge that there will of course be cross-over, but ultimately we support the new clause. Given my very fond memories of serving on the Select Committee, I want to put on the record my support for it. My support for this new clause is not meant as any disrespect to that Committee. It is genuinely extremely effective in scrutinising the Government and holding them to account, and I know it will continue to do that in relation to both this Bill and other aspects of DCMS. The need for transparency, openness and scrutiny of this Bill is fundamental if it is truly to be world-leading, which is why we support the new clause.
I am grateful for the opportunity to discuss this issue once again. I want to put on the record my thanks to the Joint Committee, which the hon. Member for Ochil and South Perthshire sat on, for doing such fantastic work in scrutinising the draft legislation. As a result of its work, no fewer than 66 changes were made to the Bill, so it was very effective.
I want to make one or two observations about scrutinising the legislation following the passage of the Bill. First, there is the standard review mechanism in clause 149, on pages 125 and 126, which provides for a statutory review not before two years and not after five years of the Bill receiving Royal Assent.
On that review function, it would help if the Minister could explain a bit more why it was decided to do that as a one-off, and not on a rolling two-year basis, for example.
That is a fairly standard clause in legislation. Clearly, for most legislation and most areas of Government activity, the relevant departmental Select Committee would be expected to provide the ongoing scrutiny, so ordinarily the DCMS Committee would do that. I hear the shadow Minister’s comments: she said that this proposal is not designed in any way to impugn or disrespect that Committee, but I listened to the comments of the Chair of that Committee on Second Reading, and I am not sure he entirely shares that view—he expressed himself in quite forthright terms.
On the proposal, we understand that the Joint Committee did valuable work. This is an unusual piece of legislation, in that it is completely groundbreaking. It is unlike any other, so the case for having a particular Committee look at it may have some merits. I am not in a position to give a definitive Government response to that because the matter is still under consideration, but if we were to establish a special Committee to look at a single piece of legislation, there are two ways to do it. It could either be done in statute, as the new clause seeks, or it could be done by Standing Orders.
Generally speaking, it is the practice of the House to establish Committees by Standing Orders of the House rather than by statute. In fact, I think the only current Committee of the House established by statute—Ms Rees, you will correct me if I am wrong, as you are more of an expert on these matters than me—is the Intelligence and Security Committee, which was established by the Intelligence Services Act 1994. That is obviously very unusual, because it has special powers. It looks into material that would ordinarily be classified as secret, and it has access to the intelligence services. It is a rather unusual Committee that has to be granted special powers because it looks into intelligence and security matters. Clearly, those considerations do not apply here. Were a particular Committee to be established, the right way of doing that would not be in statute, as the new clause proposes, but via the Standing Orders of the House, if that is something that Parliament wants to do.
First, let me also put on record my thanks to my hon. Friend for his service on the Joint Committee. He did a fantastic job and, as I said, the Committee’s recommendations have been powerfully heard. I thank him for his acknowledgment that if one were to do this, the right way to do it would be through Standing Orders. I have heard the point he made in support of some sort of ongoing special committee. As I say, the Government have not reached a view on this, but if one were to do that, I agree with my hon. Friend that Standing Orders would be the right mechanism.
One of the reasons for that can be found in the way the new clause has been drafted. Subsections (5) and (6) say:
“The membership and Chair of the Committee shall be appointed by regulations made by the Secretary of State…the tenure of office of members of, the procedure of and other matters…shall be set out in regulations made by the Secretary of State.”
I know those regulations are then subject to approval by a resolution of the House, but given the reservations expressed by Opposition Members about powers for the Secretary of State over the last eight sitting days, it is surprising to see the new clause handing the Secretary of State—in the form of a regulation-making power—the power to form the Committee.
That underlines why doing this through Standing Orders, so that the matter is in the hands of the whole House, is the right way to proceed, if that is something we collectively wish to do. For that reason, we will not support the new clause. Obviously, we will get back to the House in due course once thinking has been done about potential Committees, but that can be done as a separate process to the legislation. In any case, post-legislative scrutiny will not be needed until the regime is up and running, which will be after Royal Assent, so that does not have enormous time pressure on it.
A comment was made about future-proofing the Bill and making sure it stays up to date. There is a lot in that, and we need to make sure we keep up to date with changing technologies, but the Bill is designed to be tech agnostic, so if there is change in technology, that is accommodated by the Bill because the duties are not specific to any given technology. A good example is the metaverse. That was not conceived or invented prior to the Bill being drafted; none the less, it is captured by the Bill. The architecture of the Bill, relying on codes of practice produced by Ofcom, is designed to ensure flexibility so that the codes of practice can be kept up to date. I just wanted to make those two points in passing, as the issue was raised by the hon. Member for Aberdeen North.
The reason the new clause is drafted in that way is because I wanted to recognise the work of the Joint Committee and to take on board its recommendations. If it had been entirely my drafting, the House of Lords would certainly not have been involved, given that I am not the biggest fan of the House of Lords, as its Members are not elected. However, the decision was made to submit the new clause as drafted.
The Minister has said that the Government have not come to a settled view yet, which I am taking as the Minister not saying no. He is not standing up and saying, “No, we will definitely not have a Standing Committee.” I am not suggesting he is saying yes, but given that he is not saying no, I am happy to withdraw the new clause. If the Minister is keen to come forward at a future stage with suggestions for changes to Standing Orders, which I understand have to be introduced by the Leader of the House or the Cabinet Office, then they would be gladly heard on this side of the House. I beg to ask leave to withdraw the motion.
Clause, by leave, withdrawn.
New Clause 38
Adults’ risk assessment duties
“(1) This section sets out duties which apply in relation to internet services within section 67(2).
(2) A duty to take appropriate steps to keep an adults’ risk assessment up to date, including when OFCOM makes any significant change to a risk profile that relates to services of the kind in question.
(3) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient adults’ risk assessment relating to the impacts of that proposed change.
(4) A duty to make and keep a written record, in an easily understandable form, of every risk assessment under subsections (2) and (3).
(5) An “adults’ risk assessment” of a service of a particular kind means an assessment of the following matters, taking into account the risk profile that relates to services of that kind—
(a) the user base;
(b) the level of risk of adults who are users of the service encountering, by means of the service, each kind of priority content that is harmful to adults (with each kind separately assessed), taking into account (in particular) algorithms used by the service, and how easily, quickly and widely content may be disseminated by means of the service;
(c) the level of risk of harm to adults presented by different kinds of priority content that is harmful to adults;
(d) the level of risk of harm to adults presented by priority content that is harmful to adults which particularly affects individuals with a certain characteristic or members of a certain group;
(e) the level of risk of functionalities of the service facilitating the presence or dissemination of priority content that is harmful to adults, identifying and assessing those functionalities that present higher levels of risk;
(f) the different ways in which the service is used, and the impact of such use on the level of risk of harm that might be suffered by adults;
(g) the nature, and severity, of the harm that might be suffered by adults from the matters identified in accordance with paragraphs (b) to (f);
(h) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to promote users’ media literacy and safe use of the service, and other systems and processes) may reduce or increase the risks identified.
(6) In this section references to risk profiles are to the risk profiles for the time being published under section 83 which relate to the risk of harm to adults presented by priority content that is harmful to adults.
(7) The provisions of Schedule 3 apply to any assessment carried out under this section in the same way as they apply to any such assessment relating to a Part 3 service.”—(John Nicolson.)
This new clause applies adults’ risk assessment duties to pornographic sites.
Brought up, and read the First time.
I beg to move, That the clause be read a Second time.
I will take this opportunity, as my hon. Friend has done, to add a few words of thanks. She has already thanked all the people in this place who we should be thanking, including the Clerks, who have done a remarkable job over the course of our deliberations with advice, drafting, and support to the Chair. I also thank the stakeholder organisations. This Bill is uniquely one in which the stakeholders—the children’s charities and all those other organisations—have played an incredible part. I know from meetings that have already been advertised that those organisations will continue playing that part over the coming weeks, up until Report. It has been fantastic.
Finally, I will mention two people who have done a remarkable amount of work: my researcher Iona and my hon. Friend’s researcher Freddie, who have done a huge amount to help us prepare speaking notes. It is a big task, because this is a complex Bill. I add my thanks to you, Ms Rees, for the way you have chaired this Committee. Please thank Sir Roger on our behalf as well.
Seeing as we are not doing spurious points of order, I will also take the opportunity to express our thanks. The first one is to the Chairs: thank you very much, Ms Rees and Sir Roger, for the excellent work you have done in the Chair. This has been a very long Bill, and the fact that you have put up with us for so long has been very much appreciated.
I thank all the MPs on the Committee, particularly the Labour Front-Bench team and those who have been speaking for the Labour party. They have been very passionate and have tabled really helpful amendments—it has been very good to work with the Labour team on the amendments that we have put together, particularly the ones we have managed to agree on, which is the vast majority. We thank Matt Miller, who works for my hon. Friend the Member for Ochil and South Perthshire. He has been absolutely wonderful. He has done an outstanding amount of work on the Bill, and the amazing support that he has given us has been greatly appreciated. I also thank the Public Bill Office, especially for putting up with the many, many amendments we submitted, and for giving us a huge amount of advice on them.
Lastly, I thank the hundreds of organisations that got in touch with us, and the many people who took the time to scrutinise the Bill, raise their concerns, and bring those concerns to us. Of those hundreds of people and organisations, I particularly highlight the work of the National Society for the Prevention of Cruelty to Children. Its staff have been really helpful to work with, and I have very much appreciated their advice and support in drafting our amendments.
I feel slightly out of place, but I will add some concluding remarks in a moment; I should probably first respond to the substance of the new clause. The power to co-operate with other regulators and share information is, of course, important, but I am pleased to confirm that it is already in the Bill—it is not the first time that I have said that, is it?
Clause 98 amends section 393(2)(a) of the Communications Act 2003. That allows Ofcom to disclose information and co-operate with other regulators. Our amendment will widen the scope of the provision to include carrying out the functions set out in the Bill.
The list of organisations with which Ofcom can share information includes a number of UK regulators—the Competition and Markets Authority, the Information Commissioner, the Financial Conduct Authority and the Payment Systems Regulator—but that list can be amended, via secondary legislation, if it becomes necessary to add further organisations. In the extremely unlikely event that anybody wants to look it up, that power is set out in subsections (3)(i) and (4)(c) of section 393 of the Communications Act 2003. As the power is already created by clause 98, I hope that we will not need to vote on new clause 41.
I echo the comments of the shadow Minister about the Digital Regulation Cooperation Forum. It is a non-statutory body, but it is extremely important that regulators in the digital arena co-operate with one another and co-ordinate their activities. I am sure that we all strongly encourage the relevant regulators to work with the DRCF and to co-operate in this and adjacent fields.
I will bring my remarks to a close with one or two words of thanks. Let me start by thanking Committee members for their patience and dedication over the nine days we have been sitting—50-odd hours in total. I think it is fair to say that we have given the Bill thorough consideration, and of course there is more to come on Report, and that is before we even get to the House of Lords. This is the sixth Bill that I have taken through Committee as Minister, and it is by far the most complicated and comprehensive, running to 194 clauses and 15 schedules, across 213 pages. It has certainly been a labour. Given its complexity, the level of scrutiny it has received has been impressive—sometimes onerous, from my point of view.
The prize for the most perceptive observation during our proceedings definitely goes to the hon. Member for Aberdeen North, who noticed an inconsistency between use of the word “aural” in clause 49 and “oral” in clause 189, about 120 pages later.
I certainly thank our fantastic Chairs, Sir Roger Gale and Ms Rees, who have chaired our proceedings magnificently and kept us in order, and even allowed us to finish a little early, so huge thanks to them. I also thank the Committee Clerks for running everything so smoothly and efficiently, the Hansard reporters for deciphering our sometimes near-indecipherable utterances, and the Officers of the House for keeping our sittings running smoothly and safely.
I also thank all those stakeholders who have offered us their opinions; I suspect that they will continue to do so during the rest of the passage of the Bill. Their engagement has been important and very welcome. It has brought external views into Parliament, which is really important.
I conclude by thanking the people who have been working on the Bill the longest and hardest: the civil servants in the Department for Digital, Culture, Media and Sport. Some members of the team have been working on the Bill in its various forms, including White Papers and so on, for as long as five years. The Bill has had a long gestation. Over the last few months, as we have been updating the Bill, rushing to introduce it, and perhaps even preparing some amendments for Report, they have been working incredibly hard, so I give a huge thanks to Sarah Connolly and the whole team at DCMS for all their incredible work.
Finally, as we look forward to Report, which is coming up shortly, we are listening, and no doubt flexibility will be exhibited in response to some of the points that have been raised. I look forward to working with members of the Committee and Members of the House more widely as we seek to make the Bill as good as it can be. On that note, I will sit down for the last time.