All 8 Debates between Paul Scully and Damian Collins

Tue 12th Sep 2023 - Online Safety Bill - Commons Chamber - Consideration of Lords amendments
Tue 17th Jan 2023 - Online Safety Bill
Thu 15th Dec 2022 - ONLINE SAFETY BILL (Third sitting) - Public Bill Committees - Committee stage (re-committed clauses and schedules): 3rd sitting
Tue 13th Dec 2022 - ONLINE SAFETY BILL (First sitting) - Public Bill Committees - Committee stage (re-committed clauses and schedules): 1st sitting
Tue 13th Dec 2022 - ONLINE SAFETY BILL (Second sitting) - Public Bill Committees - Committee stage (re-committed clauses and schedules): 2nd sitting
Mon 5th Dec 2022 - Online Safety Bill
Wed 26th Jan 2022 - Economic Crime: Planned Government Bill - Commons Chamber
Wed 24th Feb 2021 - Uber: Supreme Court Ruling - Commons Chamber

Online Safety Bill

Debate between Paul Scully and Damian Collins
Paul Scully

I have talked about the fact that we have to keep this legislation under review, because the landscape is fast-moving. At every stage that I have been dealing with this Bill, I have said that inevitably we will have to come back. We can make the Bill as flexible, proportionate and tech-unspecific as we can, but things are moving quickly. With all our work on AI, for example, such as the AI summit, the work of the Global Partnership on Artificial Intelligence, the international response, the Hiroshima accord and all the other areas that my hon. Friend the Member for Weston-super-Mare (John Penrose) spoke about earlier, we will have to come back, review it and look at whether the legislation remains world-beating. It is not just about the findings of Ofcom as it reports back to us.

I need to make a bit of progress, because I hope to have time to sum up a little bit at the end. We have listened to concerns about ensuring that the Bill provides the most robust protections for children from pornography and on the use of age assurance mechanisms. We are now explicitly requiring relevant providers to use highly effective age verification or age estimation to protect children from pornography and other primary priority content that is harmful to children. The Bill will also ensure a clear privacy-preserving and future-proofed framework governing the use of age assurance, which will be overseen by Ofcom.

There has been coverage in the media about how the Bill relates to encryption, which has often not been accurate. I take the opportunity to set the record straight. Our stance on challenging sexual abuse online remains the same. Last week in the other place, my noble Friend Lord Parkinson, the Parliamentary Under-Secretary of State for Arts and Heritage, shared recent data from UK police forces that showed that 6,350 offences related to sexual communication with a child were recorded last year alone. Shockingly, 5,500 of those offences took place against primary school-age children. Those appalling statistics illustrate the urgent need for change. The Government are committed to taking action against the perpetrators and stamping out these horrific crimes. The information that social media companies currently give to UK law enforcement contributes to more than 800 arrests or voluntary attendances of suspected child sexual offenders on average every month. That results in an estimated 1,200 children being safeguarded from child sexual abuse.

There is no intention by the Government to weaken the encryption technology used by platforms. As a last resort, on a case-by-case basis, and only when stringent privacy safeguards have been met, Ofcom will have the power to direct companies to make best efforts to develop or source technology to identify and remove illegal child sexual abuse content. We know that this technology can be developed. Before it can be required by Ofcom, such technology must meet minimum standards of accuracy. If appropriate technology does not exist that meets these requirements, Ofcom cannot require its use. That is why the powers include the ability for Ofcom to require companies to make best endeavours to develop or source a new solution.

Damian Collins (Folkestone and Hythe) (Con)

Does my hon. Friend agree that the companies already say in their terms of service that they do not allow illegal use of their products, yet they do not say how they will monitor whether there is illegal use and what enforcement they take? What the Bill gives us, for the first time, is the right for Ofcom to know the answers to those questions and to know whether the companies are even enforcing their own terms of service.

Paul Scully

My hon. Friend makes an important point, and I thank him for the amazing work he has done in getting the Bill to this point and for his ongoing help and support in making sure that we get it absolutely right. This is not about bashing technology companies; it is about not only holding them to account, but bringing them closer, to make sure that we can work together on these issues to protect the children I was talking about.

Despite the breadth of existing safeguards, we recognise the concerns expressed about privacy and technical feasibility in relation to Ofcom’s power to issue CSE or terrorism notices. That is why we introduced additional safeguards in the Lords. First, Ofcom will be required to obtain a skilled person’s report before issuing any warning notice and exercising its powers under clause 122. Ofcom must also provide a summary of the report to the relevant provider when issuing a warning notice. We are confident that in addition to Ofcom’s existing routes of evidence gathering, this measure will help to provide the regulator with the necessary information to determine whether to issue a notice and the requirements that may be put in place.

We also brought forth amendments requiring Ofcom to consider the impact that the use of technology would have on the availability of journalistic content and the confidentiality of journalistic sources when considering whether to issue a notice. That builds on the existing safeguards in clause 133 regarding freedom of expression and privacy.

We recognise the disproportionate levels of harm that women and girls continue to face online, and that is why the Government have made a number of changes to the Bill to strengthen protections for women and girls. First, the Bill will require Ofcom to produce guidance on online harms that disproportionately affect women and girls and to provide examples of best practice to providers, and it will require providers to bring together in one clear place all the measures that they take to tackle online abuse against women and girls on their platforms. The Bill will also require Ofcom to consult the Victims’ Commissioner and the Domestic Abuse Commissioner, in addition to the Children’s Commissioner, while preparing codes of practice. That change to the Bill will ensure that the voices of victims of abuse are brought into the consultation period.

--- Later in debate ---
Paul Scully

The hon. Gentleman is talking about the access of coroners, families and others to information, following the sad death of Molly Russell. Again, I pay tribute to Ian Russell and all the campaigners. I am glad that we have been able to find an answer to a very complex situation, not only because of its international nature but because of data protection, et cetera.

The measures I have outlined will ensure that risks relating to security vulnerabilities are managed. The Bill is also clear that Ofcom cannot require companies to use proactive technology on privately communicated content, in order to comply with their safety duties, which will provide further safeguards for user privacy and data security.

Damian Collins

Will the Minister make it clear that we should expect the companies to use proactive technology, because they already use it to make money by recommending content to people, which is a principal reason for the Bill? If they use proactive technology to make money, they should also use it to keep people safe.

Paul Scully

My hon. Friend absolutely nails it. He said earlier that businesses are already collecting this data. Since I was first involved with the Bill, it has primarily been about getting businesses to adhere to their own terms and conditions. The data they use should be used in that way.

The amendment to the definition of “freedom of expression” in part 12 would have no effect as these concepts are already covered by the existing definition. Changing the definition of “automated tool” would introduce untested terms and would have an unclear and confusing impact on the duties.

My hon. Friend the Member for Yeovil also asked for clarification of how Ofcom’s power to view information remotely will be used, and whether the power is sufficiently safeguarded. I assure the House that this power is subject to strict safeguards that mean it cannot be used to undermine a provider’s systems.

On Third Reading in the other place, the Government introduced amendments that defined the regulator’s power to view information remotely, whereas previously the Bill spoke of access. As such, there are no risks to system security, as the power does not enable Ofcom to access the service. Ofcom also has a duty to act proportionately and must abide by its privacy obligations under the Human Rights Act. Ofcom has a stringent restriction on disclosing businesses’ commercially sensitive and other information without consent.

My hon. Friend also asked for clarification on whether Ofcom will be able to view live user data when using this power. Generally, Ofcom would expect to require a service to use a test dataset. However, there may be circumstances where Ofcom asks a service to execute a test using data that it holds, for example, in testing how content moderation systems respond to certain types of content on a service as part of an assessment of the systems and processes. In that scenario, Ofcom may need to use a provider’s own test dataset containing content that has previously violated its own terms of service. However, that would be subject to Ofcom’s privacy obligations and data protection law.

Lords amendment 17 seeks to explicitly exempt low-risk functionality from aspects of user-to-user services’ children’s risk assessment duties. I am happy to reassure my hon. Friend that the current drafting of the Government’s amendment in lieu of Lords amendment 17 places proportionate requirements on providers. It explicitly excludes low-risk functionality from the more stringent duty to identify and assess the impact that higher-risk functionalities have on the level of risk of harm to children. Proportionality is further baked into this duty through Ofcom’s risk assessment guidance. Ofcom is bound by the principle of proportionality as part of its general duties under the Communications Act 2003, as updated by the Bill. As such, it would not be able to recommend that providers should identify and assess low-risk functionality.

The amendment to Lords amendment 217 tabled by my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) would introduce a new safeguard that requires Ofcom to consider whether technology required under a clause 122 notice would circumvent end-to-end encryption. I wish to reassure him and others who have raised the question that the amendment is unnecessary because it is duplicative of existing measures that restrict Ofcom’s use of its powers. Under the Bill’s safeguards, Ofcom cannot require platforms to weaken or remove encryption, and must already consider the risk that specified technology can result in a breach of any statutory provision or the rule of law concerning privacy. We have intentionally designed the Bill so that it is technology neutral and futureproofed, so we cannot accept amendments that risk the legislation quickly becoming out of date. That is why we focused on safeguards that uphold user rights and ensure measures that are proportionate to the specific risks, rather than focusing on specific features such as encryption. For the reasons I have set out, I cannot accept the amendment and hope it will not be pressed to a vote.

The amendment tabled by my hon. Friend the Member for Stroud (Siobhan Baillie) would create an additional reporting requirement on Ofcom to review, as part of its report on the use of age assurance, whether the visibility of a user’s verification status improves the effectiveness of age assurance, but that duplicates existing review requirements in the Bill. The Bill already provides for a review of user verification; under clause 179, the Secretary of State will be required to review the operation of the online safety regulatory framework as a whole. This review must assess how effective the regulatory framework is at minimising the risk of harm that in-scope services pose to users in the UK. That may include a review of the effectiveness of the current user verification and non-verified users duty. I also thank my hon. Friend for raising the issue of user verification and the visibility of verification status. I am pleased to confirm that Ofcom will have the power to set out guidance on user verification status being visible to all users. With regard to online fraud or other illegal activity, mandatory user verification and visibility of verification status are measures that Ofcom could recommend and require under the legal safety duties.

Let me quickly cover some of the other points raised in the debate. I thank my hon. Friend the Member for Gosport (Dame Caroline Dinenage), a former Minister, for all her work. She talked about young people, and the Bill contains many measures, for example on self-harm and suicide content, that reflect those concerns and will help to protect them. On the comments made by the hon. Member for Aberdeen North (Kirsty Blackman) and indeed the shadow Minister, the hon. Member for Pontypridd (Alex Davies-Jones), whom I am glad to see back in her place, there are a number of review points. Clause 179 requires the Secretary of State to review how the Bill is working in practice, and there will be a report resulting from that, which will be laid before Parliament. We also have the annual Ofcom report that I talked about, and most statutory instruments in the Bill will be subject to the affirmative procedure. The Bill refers to a review after two to five years—Ministers can dictate when it takes place within that period—but that is based on allowing a long enough time for the Bill to bed in and be implemented. It is important that we have the ability to look at that in Parliament. The principles of the UN convention on the rights of the child are already in the Bill; although the Bill does not cite the convention by name, its principles are all covered.

My hon. Friend the Member for Folkestone and Hythe (Damian Collins) did an amazing job in his time in my role, and before and afterwards as Chair of the Joint Committee responsible for the pre-legislative scrutiny of the Online Safety Bill. When he talked about scrutiny, I had the advantage of seeing the wry smile of the officials in the Box behind him. That scrutiny has been going on since 2021. Sarah Connolly, one of our amazing team of officials, has been involved with the Bill since it was just a concept.

Damian Collins

As Carnegie UK Trust observed online, a child born on the day the Government first published their original internet safety strategy would now be in its second year of primary school.

Paul Scully

I do not think I need to respond to that, but it goes to show, does it not?

My hon. Friend talked about post-legislative scrutiny. Now that we have the new Department for Science, Innovation and Technology, we have extra capacity within Committees to look at various aspects, and not just online safety, as important as that is. It also gives us the ability to have sub-Committees. Clearly, we want to make sure that this and all the decisions that we make are scrutinised well. We are always open to looking at what is happening. My hon. Friend talked about Ofcom being able to appoint skilled persons for research—I totally agree and he absolutely made the right point.

My right hon. Friend the Member for Basingstoke (Dame Maria Miller) and the hon. Member for Caithness, Sutherland and Easter Ross (Jamie Stone) talked about cyber-flashing. As I have said, that has come within the scope of the Bill, but we will also be implementing a broader package of offences that will cover the taking of intimate images without consent. To answer my right hon. Friend’s point, yes, we will still look further at that matter.

The hon. Member for Leeds East (Richard Burgon) talked about Joe Nihill. Will he please send my best wishes and thanks to Catherine and Melanie for their ongoing work in this area? It is always difficult, but it is admirable that people can turn a tragedy into such a positive cause. My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) made two points with which I absolutely agree. They are very much covered in the Bill and in our thinking as well, so I say yes to both.

My right hon. Friend the Member for Chelmsford (Vicky Ford) and my hon. Friend the Member for Penistone and Stocksbridge (Miriam Cates) talked about pornography. Clearly, we must build on the Online Safety Bill. We have the pornography review as well, which explores regulation, legislation and enforcement. We very much want to make sure that this is the first stage, but we will look at pornography and the enforcement around that in a deeper way over the next 12 months.

Online Safety Bill

Debate between Paul Scully and Damian Collins
Paul Scully

For the purpose of future-proofing, we have tried to make the Bill as flexible and as technologically neutral as possible so that it can adapt to changes. I think we will need to review it, and indeed I am sure that, as technology changes, we will come back with new legislation in the future to ensure that we continue to be world-beating—but let us see where we end up with that.

Damian Collins

May I follow up my hon. Friend’s response to our right hon. Friend the Member for Bromsgrove (Sajid Javid)? If it is the case that coroners cannot access data and information that they need in order to go about their duties—which was the frustrating element in the Molly Russell case—will the Government be prepared to close that loophole in the House of Lords?

Paul Scully

We will certainly work with others to address that, and if there is a loophole, we will seek to act, because we want to ensure—

ONLINE SAFETY BILL (Third sitting)

Debate between Paul Scully and Damian Collins
Committee stage (re-committed clauses and schedules)
Thursday 15th December 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 15 December 2022
Paul Scully

Ofcom will assess services that are close to meeting the threshold conditions of category 1 services and will publish a publicly available list of those emerging high-risk services. A service would have to meet two conditions to be added to the emerging services list: it would need at least 75% of the user number specified in any category 1 threshold condition, and at least one functionality of a category 1 threshold condition, or one specified combination of a functionality and a characteristic or factor of a category 1 threshold condition.

Ofcom will monitor the emergence of new services. If it becomes apparent that a service has grown sufficiently to meet the threshold of becoming a category 1 service, Ofcom will be required to add that service to the register. The new clause and the consequential amendments take into account the possibility of quick growth.

Following the removal of “legal but harmful” duties, category 1 services will be subject to new transparency, accountability and free speech duties, as well as duties relating to protection for journalists and democratic content. Requiring all companies to comply with that full range of category 1 duties would pose a disproportionate regulatory burden on smaller companies that do not exert the same influence on public discourse, and that would possibly divert those companies’ resources away from tackling vital tasks.

Damian Collins (Folkestone and Hythe) (Con)

Will my hon. Friend confirm that the risk assessments for illegal content—the priority illegal offences; the worst kind of content—apply to all services, whether or not they are category 1?

Paul Scully

My hon. Friend is absolutely right. All companies will still have to tackle the risk assessment, and will have to remove illegal content. We are talking about the extra bits that could take a disproportionate amount of resource from core functions that we all want to see around child protection.

Paul Scully

Absolutely. The Department has techniques for dealing with misinformation and disinformation as well, but we will absolutely push Ofcom to work as quickly as possible. As my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), the former Secretary of State, has said, once an election is done, it is done and it cannot be undone.

Damian Collins

Could the Minister also confirm that the provisions of the National Security Bill read across to the Online Safety Bill? Where disinformation is disseminated by networks operated by hostile foreign states, particularly Russia, as has often been the case, that is still in scope. That will still require a risk assessment for all platforms, whether or not they are category 1.

Paul Scully

Indeed. We need to take a wide-ranging, holistic view of disinformation and misinformation, especially around election times. There is a suite of measures available to us, but it is still worth pushing Ofcom to make sure that it works as quickly as possible.

Amendment 48 agreed to.

Amendment made: 49, in clause 82, page 72, line 23, after “conditions” insert

“or the conditions in section (List of emerging Category 1 services)(2)”.—(Paul Scully.)

This is a technical amendment ensuring that references to assessments of user-to-user services in the new clause inserted by NC7 relate to the user-to-user part of the service.

Clause 82, as amended, ordered to stand part of the Bill.

Schedule 11

Categories of regulated user-to-user services and regulated search services: regulations

--- Later in debate ---
Paul Scully

I am glad that we are all in agreement on the need for a review. It is important that we have a comprehensive and timely review of the regulatory regime and how it is built into legislation. It is important that we understand that the legislation has the impact that we intend.

The legislation clearly sets out what the review must consider, how Ofcom is carrying out its role and if the legislation is effective in dealing with child protection, which as the hon. Lady rightly says is its core purpose. We have struck the balance of specifying two to five years after the regime comes into force, because it provides a degree of flexibility to future Ministers to judge when it should happen. None the less, I take the hon. Lady’s point that technology is developing. That is why this is a front-footed first move in this legislation, when other countries are looking at what we are doing; because of that less prescriptive approach to technologies, the legislation can be flexible and adapt to emerging new technologies. Inevitably, this will not be the last word. Some of the things in the Digital Economy Act 2017, for example, are already out of date, as is some of the other legislation that was put in place in the early 2000s. We will inevitably come back to this, but I think we have the right balance at the moment in terms of the timing.

I do not think we need to bed in whom we consult, but wider consultation will none the less be necessary to ascertain the effectiveness of the legislation.

Damian Collins

I am following carefully what the Minister says, but I would say briefly that a lot of the debate we have had at all stages of the Bill has rested on how we believe Ofcom will use the powers it has been given, and we need to make sure that it does that. We need to ensure that it is effective and that it has the resources it needs. The hon. Member for Aberdeen North (Kirsty Blackman) makes an important point that it may not be enough to rely on a Select Committee of the Lords or the Commons having the time to do that in the detail we would want. We might need to consider either a post-legislative scrutiny Committee or some other mechanism to ensure that there is the necessary level of oversight.

Paul Scully

My hon. Friend is absolutely right. The report, as it stands, obviously has to be laid before Parliament and will form part of the package of parliamentary scrutiny. But, yes, we will consider how we can utilise the expertise of both Houses in post-legislative scrutiny. We will come back on that.

Question put and agreed to.

Clause 155, as amended, accordingly ordered to stand part of the Bill.

Clause 169

Individuals providing regulated services: liability

Amendment made: 57, in clause 169, page 143, line 15, at end insert—

“(fa) Chapter 2A of Part 4 (terms of service: transparency, accountability and freedom of expression);”.—(Paul Scully.)

Clause 169 is about liability of providers who are individuals. This amendment inserts a reference to Chapter 2A, which is the new Chapter expected to be formed by NC3 to NC6, so that individuals may be jointly and severally liable for the duties imposed by that Chapter.

Clause 169, as amended, ordered to stand part of the Bill.

Clause 183 ordered to stand part of the Bill.

Schedule 17

Video-sharing platform services: transitional provision etc

Amendments made: 94, in schedule 17, page 235, line 43, leave out paragraph (c).

This amendment is consequential on Amendment 6 (removal of clause 12).

Amendment 95, in schedule 17, page 236, line 27, at end insert—

“(da) the duties set out in sections (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service) (terms of service);”.—(Paul Scully.)

This amendment ensures that services already regulated under Part 4B of the Communications Act 2003 (video-sharing platform services) are not required to comply with the new duties imposed by NC3 and NC4 during the transitional period.

Question proposed, That the schedule, as amended, be the Seventeenth schedule to the Bill.

--- Later in debate ---
Paul Scully

The clause provides legal certainty about the meaning of those terms as used in the Bill: things such as “content”, “encounter”, “taking down” and “terms of service”. That is what the clause is intended to do. It is intentional and is for the reasons the hon. Lady said. Oral means speech and speech only. Aural is speech and other sounds, which is what can be heard on voice calls. That includes music as well. One is speech. The other is the whole gamut.

Damian Collins

I am intrigued, because the hon. Member for Aberdeen North makes an interesting point. It is not one I have heard made before. Does the Minister think there is a distinction between oral and aural, where oral is live speech and aural is pre-recorded material that might be played back? Are those two considered distinct?

Paul Scully

My knowledge is being tested, so I will write to the hon. Member for Aberdeen North and make that available to the Committee. Coming back to the point about oral and aural that she made on Tuesday, in relation to another clause on the exclusions: as I said, we have a narrow exemption to ensure that traditional phone calls are not subject to regulation. But that does mean that if a service such as Fortnite, which she spoke about previously, enables adults and children to have one-to-one oral calls, companies will still need to address the surrounding functionality of how that happens, because enabling it might cause harm—for example, if an adult can contact an unknown child. That is still captured within the Bill.

ONLINE SAFETY BILL (First sitting)

Debate between Paul Scully and Damian Collins
Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 13 December 2022
Damian Collins

On these platforms, the age verification requirements are clear: they must age-gate the adult content or get rid of it. They must do one or the other. Rightly, the Bill does not specify technologies. Technologies are available. The point is that a company must demonstrate that it is using an existing and available technology or that it has some other policy in place to remedy the issue. It has a choice, but it cannot do nothing. It cannot say that it does not have a policy on it.

Age assurance is always more difficult for children, because they do not have the same sort of ID that adults have. However, technologies exist: for instance, Yoti uses facial scanning. Companies do not have to do that either; they have to demonstrate that they do something beyond self-certification at the point of signing up. That is right. Companies may also demonstrate what they do to take robust action to close the accounts of children they have identified on their platforms.

If a company’s terms of service state that people must be 13 or over to use the platform, the company is inherently stating that the platform is not safe for someone under 13. What does it do to identify people who sign up? What does it do to identify people once they are on the platform, and what action does it then take? The Bill gives Ofcom the powers to understand those things and to force a change of behaviour and action. That is why—to the point made by the hon. Member for Pontypridd—age assurance is a slightly broader term, but companies can still extract a lot of information to determine the likely age of a child and take the appropriate action.

Paul Scully

I think we are all in agreement, and I hope that the Committee will accept the amendments.

Amendment 1 agreed to.

Amendments made: 2, in clause 11, page 10, line 25, leave out

“(for example, by using age assurance)”.

This amendment omits words which are no longer necessary in subsection (3)(b) of clause 11 because they are dealt with by the new subsection inserted by Amendment 3.

Amendment 3, in clause 11, page 10, line 26, at end insert—

“(3A) Age assurance to identify who is a child user or which age group a child user is in is an example of a measure which may be taken or used (among others) for the purpose of compliance with a duty set out in subsection (2) or (3).”—(Paul Scully.)

This amendment makes it clear that age assurance measures may be used to comply with duties in clause 11(2) as well as (3) (safety duties protecting children).

--- Later in debate ---
Paul Scully

To protect free speech and remove any possibility that the Bill could cause tech companies to censor legal content, I seek to remove the so-called “legal but harmful” duties from the Bill. These duties are currently set out in clauses 12 and 13 and apply to the largest in-scope services. They require services to undertake risk assessments for defined categories of harmful but legal content, before setting and enforcing clear terms of service for each category of content.

I share the concerns raised by Members of this House and more broadly that these provisions could have a detrimental effect on freedom of expression. It is not right that the Government define what legal content they consider harmful to adults and then require platforms to risk assess for that content. Doing so may encourage companies to remove legal speech, undermining this Government’s commitment to freedom of expression. That is why these provisions must be removed.

At the same time, I recognise the undue influence that the largest platforms have over our public discourse. These companies get to decide what we do and do not see online. They can arbitrarily remove a user’s content or ban them altogether without offering any real avenues of redress to users. On the flip side, even when companies have terms of service, these are often not enforced, as we have discussed. That was the case after the Euro 2020 final where footballers were subject to the most appalling abuse, despite most platforms clearly prohibiting that. That is why I am introducing duties to improve the transparency and accountability of platforms and to protect free speech through new clauses 3 and 4. Under these duties, category 1 platforms will only be allowed to remove or restrict access to content or ban or suspend users when this is in accordance with their terms of service or where they face another legal obligation. That protects against the arbitrary removal of content.

Companies must ensure that their terms of service are consistently enforced. If companies’ terms of service say that they will remove or restrict access to content, or will ban or suspend users in certain circumstances, they must put in place proper systems and processes to apply those terms. That will close the gap between what companies say they will do and what they do in practice. Services must ensure that their terms of service are easily understandable to users and that they operate effective reporting and redress mechanisms, enabling users to raise concerns about a company’s application of the terms of service. We will debate the substance of these changes later alongside clause 18.

Clause 55 currently defines “content that is harmful to adults”, including “priority content that is harmful to adults”, for the purposes of this legislation. As this concept would be removed with the removal of the adult safety duties, this clause will also need to be removed.

Damian Collins

My hon. Friend mentioned earlier that companies will not be able to remove content if it is not part of their safety duties or if it was not a breach of their terms of service. I want to be sure that I heard that correctly and to ask whether Ofcom will be able to risk assess that process to ensure that companies are not over-removing content.

Paul Scully

Absolutely. I will come on to Ofcom in a second and respond directly to his question.

The removal of clauses 12, 13 and 55 from the Bill, if agreed by the Committee, will require a series of further amendments to remove references to the adult safety duties elsewhere in the Bill. These amendments are required to ensure that the legislation is consistent and, importantly, that platforms, Ofcom and the Secretary of State are not held to requirements relating to the adult safety duties that we intend to remove from the Bill. The amendments remove requirements on platforms and Ofcom relating to the adult safety duties. That includes references to the adult safety duties in the duties to provide content reporting and redress mechanisms and to keep records. They also remove references to content that is harmful to adults from the process for designating category 1, 2A and 2B companies. The amendments in this group relate mainly to the process for the category 2B companies.

I also seek to amend the process for designating category 1 services to ensure that they are identified based on their influence over public discourse, rather than with regard to the risk of harm posed by content that is harmful to adults. These changes will be discussed when we debate the relevant amendments alongside clause 82 and schedule 11. The amendments will remove powers that will no longer be required, such as the Secretary of State’s ability to designate priority content that is harmful to adults. As I have already indicated, we intend to remove the adult safety duties and introduce new duties on category 1 services relating to transparency, accountability and freedom of expression. While they will mostly be discussed alongside clause 18, amendments 61 to 66, 68 to 70 and 74 will add references to the transparency, accountability and freedom of expression duties to schedule 8. That will ensure that Ofcom can require providers of category 1 services to give details in their annual transparency reports about how they comply with the new duties. Those amendments define relevant content and consumer content for the purposes of the schedule.

We will discuss the proposed transparency and accountability duties that will replace the adult safety duties in more detail later in the Committee’s deliberations. For the reasons I have set out, I do not believe that the current adult safety duties with their risks to freedom of expression should be retained. I therefore urge the Committee that clauses 12, 13 and 55 do not stand part and instead recommend that the Government amendments in this group are accepted.

ONLINE SAFETY BILL (Second sitting)

Debate between Paul Scully and Damian Collins
Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 13 December 2022
Paul Scully

Hate crime legislation will always be considered by the Ministry of Justice, but I am not committing to any changes. That is beyond my reach, but the two shields that we talked about are underpinned by a safety net.

Damian Collins (Folkestone and Hythe) (Con)

Does my hon. Friend agree that the risk assessments that will be done on the priority illegal offences are very wide ranging, in addition to the risk assessments that will be done on meeting the terms of service? They will include racially and religiously motivated harassment, and putting people in fear of violence. A lot of the offences that have been discussed in the debate would already be covered by the adult safety risk assessments in the Bill.

Paul Scully

I absolutely agree. As I said in my opening remarks about the racial abuse picked up in relation to the Euro 2020 football championship, that abuse would have been against the terms and conditions of all those platforms, but it still happened because the platforms were not enforcing those terms and conditions. Whether we put them on a list in the Bill or talk about them in the terms of service, they need to be enforced, but the terms of service are there.

Damian Collins

On that point, does my hon. Friend also agree that the priority legal offences are important too? People were prosecuted for what they posted on Twitter and Instagram about the England footballers, so that shows that we understand what racially motivated offences are and that people are prosecuted for them. The Bill will require a minimum regulatory standard that meets that threshold and requires companies to act in cases such as that one, where we know what this content is, what people are posting and what is required. Not only will the companies have to act, but they will have to complete risk assessments to demonstrate how they will do that.

Paul Scully

Indeed. I absolutely agree with my hon. Friend, and that is a good example of enforcement being used. People can be prosecuted if such abuse appears on social media, but a black footballer, who would otherwise have seen that racial abuse, can choose, through the user empowerment tools, to turn it off so that he does not see it. That does not mean that we cannot pursue a prosecution for racial abuse via a third-party complaint or via the platform.

--- Later in debate ---
Paul Scully

I am seeking to impose new duties on category 1 services to ensure that they are held accountable to their terms of service and to protect free speech. Under the status quo, companies get to decide what we do and do not see online. They can arbitrarily ban users or remove their content without offering any form of due process and with very few avenues for users to achieve effective redress. On the other hand, companies’ terms of service are often poorly enforced, if at all.

I have mentioned before the horrendous abuse suffered by footballers around the 2020 Euro final, despite most platforms’ terms and conditions clearly not allowing that sort of content. There are countless similar instances, for example, relating to antisemitic abuse—as we have heard—and other forms of hate speech, that fall below the criminal threshold.

This group of amendments relates to a series of new duties that will fundamentally reset the relationship between platforms and their users. The duties will prevent services from arbitrarily removing content or suspending users without offering users proper avenues to appeal. At the same time, they will stop companies making empty promises to their users about their terms of service. The duties will ensure that where companies say they will remove content or ban a user, they actually do.

Government new clause 3 is focused on protecting free speech. It would require providers of category 1 services to remove or restrict access to content, or ban or suspend users, only where this is consistent with their terms of service. Ofcom will oversee companies’ systems and processes for discharging those duties, rather than supervising individual decisions.

Damian Collins

I am grateful for what the Minister has said, and glad that Ofcom will have a role in seeing that companies do not remove content that is not in breach of terms of service where there is no legal requirement to do so. In other areas of the Bill where these duties exist, risk assessments are to be conducted and codes of practice are in place. Will there similarly be risk assessments and codes of practice to ensure that companies comply with their freedom of speech obligations?

Paul Scully

Absolutely. As I say, it is really important that people understand right at the beginning, through risk assessments, what they are signing up for and what they can expect. To come back to the point of whether someone is an adult or a child, it is really important that parents lean in when it comes to children’s protections; that is a very important tool in the armoury.

New clause 4 will require providers of category 1 services to ensure that what their terms of service say about their content moderation policies is clear and accessible. Those terms have to be easy for users to understand, and should have sufficient detail, so that users know what to expect, in relation to moderation actions. Providers of category 1 services must apply their terms of service consistently, and they must have in place systems and processes that enable them to enforce their terms of service consistently.

These duties will give users the ability to report any content or account that they suspect does not meet a platform’s terms of service. They will also give users the ability to make complaints about platforms’ moderation actions, and raise concerns if their content is removed in error. Providers will be required to take appropriate action in response to complaints. That could include removing content that they prohibit, or reinstating content removed in error. These duties ensure that providers are made aware of issues to do with their services and require them to take action to resolve them, to keep users safe, and to uphold users’ rights to free speech.

The duties set out in new clauses 3 and 4 will not apply to illegal content, content that is harmful to children or consumer content. That is because illegal content and content that is harmful to children are covered by existing duties in the Bill, and consumer content is already regulated under consumer protection legislation. Companies will also be able to remove any content where they have a legal obligation to do so, or where the user is committing a criminal offence, even if that is not covered in their terms of service.

New clause 5 will require Ofcom to publish guidance to help providers of category 1 services to understand what they need to do to comply with their new duties. That could include guidance on how to make their terms of service clear and easy for users to understand, and how to operate an effective reporting and redress mechanism. The guidance will not prescribe what types of content companies should include in their terms of service, or how they should treat such content. That will be for companies to decide, based on their knowledge of their users, and their brand and commercial incentives, and subject to their other legal obligations.

New clause 6 clarifies terms used in new clauses 3 and 4. It also includes a definition of “Consumer content”, which is excluded from the main duties in new clauses 3 and 4. This covers content that is already regulated by the Competition and Markets Authority and other consumer protection bodies, such as content that breaches the Consumer Protection from Unfair Trading Regulations 2008. These definitions are needed to provide clarity to companies seeking to comply with the duties set out in new clauses 3 and 4.

The remaining amendments to other provisions in the Bill are consequential on the insertion of these new transparency, accountability and free speech duties. They insert references to the new duties in, for example, the provisions about content reporting, enforcement, transparency and reviewing compliance. That will ensure that the duties apply properly to the new measure.

Amendment 30 removes the duty on platforms to include clear and accessible provisions in their terms of service informing users that they have a right of action in court for breach of contract if a platform removes or restricts access to their content in violation of its terms of service. This is so that the duty can be moved to new clause 4, which focuses on ensuring that platforms comply with their terms of service. The replacement duty in new clause 4 will go further than the original duty, in that it will cover suspensions and bans of users as well as restrictions on content.

Amendments 46 and 47 impose a new duty on Ofcom to have regard to the need for it to be clear to providers of category 1 services what they must do to comply with their new duties. These amendments will also require Ofcom to have regard to the extent to which providers of category 1 services are demonstrating, in a transparent and accountable way, how they are complying with their new duties.

Lastly, amendment 95 temporarily exempts video-sharing platforms that are category 1 services from the new terms of service duties, as set out in new clauses 3 and 4, until the Secretary of State agrees that the Online Safety Bill is sufficiently implemented. This approach simultaneously maximises user protections by the temporary continuation of the VSP regime and minimises burdens for services and Ofcom. The changes are central to the Government’s intention to hold companies accountable for their promises. They will protect users in a way that is in line with companies’ terms of service. They are a critical part of the triple shield, which aims to protect adults online. It ensures that users are safe by requiring companies to remove illegal content, enforce their terms of service and provide users with tools to control their online experiences. Equally, these changes prevent arbitrary or random content removal, which helps to protect pluralistic and robust debate online. For those reasons, I hope that Members can support the amendments.

--- Later in debate ---
Paul Scully

I will have a go at that, but I am happy to write to the hon. Lady if I do not respond as fully as she wants. Down-ranking content is a moderation action, as she says, but it is not always done just to restrict access to content; there are many reasons why people might want to do it. Through these changes, we are saying that the content is not actually being restricted; it can still be seen if it is searched for or otherwise encountered. That is consistent with the clarification.

Damian Collins

This is quite an important point. The hon. Member for Aberdeen North was talking about recommendation systems. If a platform chooses not to amplify content, that is presumably not covered. As long as the content is accessible, someone could search and find it. That does not inhibit a platform’s decision, for policy reasons or whatever, not to actively promote it.

--- Later in debate ---
Paul Scully

I will come back to some of the earlier points. At the end of the day, when platforms change their terms and conditions, which they are free to do, they will be judged by their users and indeed the advertisers from whom they make their money. There are market forces—I will use that phrase as well as “commercial imperative”, to get that one in there—that will drive behaviour. It may be the usability of Facebook, or Twitter’s terms and conditions and the approach of its new owner, that will drive users to alternative platforms. I am old enough to remember Myspace, CompuServe and AOL, which tried to box people into their walled gardens. What happened to them? Only yesterday, someone from Google was saying that the new artificial intelligence chatbot—ChatGPT—may well disrupt Google. These companies, as big as they are, do not have a right to exist. They have to keep innovating. If they get it wrong, then they get it wrong.

Damian Collins

Does my hon. Friend agree that this is why the Bill is structured in the way it is? We have a wide range of priority illegal offences that companies have to meet, so it is not down to Elon Musk to determine whether he has a policy on race hate. They have to meet the legal standards set, and that is why it is so important to have that wide range of priority illegal offences. If companies go beyond that and have higher safety standards in their terms of service, that is checked as well. However, a company cannot avoid its obligations simply by changing its terms of service.

Paul Scully

My hon. Friend is absolutely right. We are putting in those protections, but we want companies to have due regard to freedom of speech.

I want to clarify a point that my hon. Friend made earlier about guidance on the new accountability, transparency and free speech duties. Companies will be free to set any terms of service that they want to, subject to their other legal obligations. That is related to the conversations that we have just been having. Those duties require companies to enforce their terms of service properly, and not to remove content or ban users except in accordance with those terms. There will be no platform risk assessments or codes of practice associated with those new duties. Instead, Ofcom will issue guidance on how companies can comply with their duties rather than codes of practice. That guidance will focus on how companies set their terms of service, but companies will not be required to set terms directly for specific types of content or to cover risks. I hope that is clear.

To answer the point made by the hon. Member for Pontypridd, I agree with the overall sentiment about how we need to protect freedom of expression.

Damian Collins

I want to be clear on my point. My question was not related to how platforms set their terms of service, which is a matter for them and they are held to account for that. If we are now bringing in requirements to say that companies cannot go beyond terms of service or their duties in the Bill if they are going to moderate content, who will oversee that? Will Ofcom have a role in checking whether platforms are over-moderating, as the Minister referred to earlier? In that case, where those duties exist elsewhere in the Bill, we have codes of practice in place to make sure it is clear what companies should and should not do. We do not seem to be doing that with this issue.

Paul Scully

Absolutely. We have captured that in other parts of the Bill, but I wanted to make that specific bit clear because I am not sure whether I understood or answered my hon. Friend’s question correctly at the time.

Question put and agreed to.

Clause 20, as amended, accordingly ordered to stand part of the Bill.

Clause 21

Record-keeping and review duties

Amendments made: 32, in clause 21, page 23, line 5, leave out “, 10 or 12” and insert “or 10”.

This amendment is consequential on Amendment 6 (removal of clause 12).

Amendment 33, in clause 21, page 23, line 45, leave out paragraph (c).

This amendment is consequential on Amendment 7 (removal of clause 13).

Amendment 34, in clause 21, page 24, line 6, leave out “section” and insert “sections”.

This amendment is consequential on Amendment 35.

Amendment 35, in clause 21, page 24, line 6, at end insert—

“, (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service) (duties about terms of service).”—(Paul Scully.)

This amendment ensures that providers have a duty to review compliance with the duties set out in NC3 and NC4 regularly, and after making any significant change to the design or operation of the service.

Question proposed, That the clause, as amended, stand part of the Bill.

--- Later in debate ---
Damian Collins

That concern would be triggered by Ofcom discovering things as a consequence of user complaint. Although Ofcom is not a complaint resolution company, users can complain to it. Independent academics and researchers may produce studies and reports highlighting problems at any time, so Ofcom does not have to wait through an annual cycle of transparency reporting. At any time, Ofcom can say, “We want to have a deeper look at this problem.” It could be something Ofcom or someone else has discovered, and Ofcom can either research that itself or appoint an outside expert.

As the hon. Member for Warrington North mentioned, very sensitive information might become apparent through the transparency reporting that one might not necessarily wish to make public because it requires further investigation and could highlight a particular flaw that could be exploited by bad actors. I would hope and expect, as I think we all would, that we would have the routine publication of transparency reporting to give people assurance that the platforms are meeting their obligations. Indeed, if Ofcom were to intervene against a platform, it would probably use information gathered and received to provide the rationale for why a fine has been issued or another intervention has been made. I am sure that Ofcom will draw all the time on information gathered through transparency reporting and, where relevant, share it.

Paul Scully

This has been a helpful debate. Everyone was right that transparency must be and is at the heart of the Bill. From when we were talking earlier today about how risk assessments and terms of service must be accessible to all, through to this transparency reporting section, it is important that we hold companies to account and that the reports play a key role in allowing users, Ofcom and civil society, including those in academia, to understand the steps that companies are taking to protect users.

Under clause 65, category 1 services, category 2A search services and category 2B user-to-user services need to publish transparency reports annually in accordance with the transparency report notice from Ofcom. That relates to the points about commerciality that my hon. Friend the Member for Folkestone and Hythe talked about. Ofcom will set out what information is required from companies in their notice, which will also specify the format, manner and deadline for the information to be provided to Ofcom. Clearly, it would not be proportionate to require every service provider within the scope of the overall regulatory framework to produce a transparency report—it is also important that we deal with capacity and proportionality—but those category threshold conditions will ensure that the framework is flexible and future-proofed.

Online Safety Bill

Debate between Paul Scully and Damian Collins
Paul Scully

The Bill is very specific with regard to encryption; this provision will cover solely CSEA and terrorism. It is important that we do not encroach on privacy.

Damian Collins (Folkestone and Hythe) (Con)

I welcome my hon. Friend to his position. Under the Bill, is it not the case that if a company refuses to use existing technologies, that will be a failure of the regulatory duties placed on that company? Companies will be required to demonstrate which technology they will use and will have to use one that is available. On encrypted messaging, is it not the case that companies already gather large amounts of information about websites that people visit before and after they send a message that could be hugely valuable to law enforcement?

Paul Scully

My hon. Friend is absolutely right. Not only is it incumbent on companies to use that technology should it exist; if they hamper Ofcom’s inquiries by not sharing information about what they are doing, what they find and which technologies they are not using, that will be a criminal liability under the Bill.

Economic Crime: Planned Government Bill

Debate between Paul Scully and Damian Collins
Wednesday 26th January 2022

Commons Chamber
Paul Scully

I come back to this point: there is no reluctance to act. What I cannot do is pre-empt Her Majesty. Our appetite, as I say, remains undiminished. It is just a shame that the right hon. Gentleman hides behind Intelligence and Security Committee papers to throw political accusations when what we are trying to do is make sure that the taxpayers of this country get value for money and are not losing money, that the number of victims of economic crime is reduced and that they get their recoveries. Let us not make it a party political issue.

Damian Collins (Folkestone and Hythe) (Con)

Does my hon. Friend agree with the recommendation of the Joint Committee on the Draft Online Safety Bill that online platforms such as Facebook should not be allowed to profit from the advertising of known frauds and scams? As part of the online safety regime, they should be required to proactively block and withdraw advertising that promotes known frauds and scams.

Paul Scully

We are really aware of the issues and we appreciate the comments in that report. As that Bill progresses, we will consider them with all due process.

Uber: Supreme Court Ruling

Debate between Paul Scully and Damian Collins
Wednesday 24th February 2021

Commons Chamber

Paul Scully

The Supreme Court ruling is final. We recognise the concerns about employment status and the potential for exploitation. We want to make it easier for individuals and businesses to understand what rights and tax obligations apply to them, and we are currently considering options to improve clarity around employment status. I have previously talked about the fact that ACAS was charged with considering fire and rehire and gathering evidence, and it has done so. It reported back to BEIS, and we will consider what it found.

Damian Collins (Folkestone and Hythe) (Con) [V]

This is a landmark ruling by the Supreme Court, but many people will be concerned that companies such as Uber should not be left to interpret what it means, because otherwise we will see a disparity between the different companies employing workers in the gig economy—for example, a deliverer for Just Eat is an employee, but one for Deliveroo is not. Will my hon. Friend give serious consideration to the Government legislating to create a level playing field and to stop these abuses?

Paul Scully

As I have said, we will look at employment conditions and ensure that employees can understand their status and tax payment conditions. There is a complication, in that the companies my hon. Friend mentioned each have different contracts, so it is important that we have something that looks at all those things in the round.