All 7 Debates between Charlotte Nichols and Paul Scully

Online Safety Bill (First sitting), Public Bill Committees, Tuesday 13th December 2022
Online Safety Bill (Second sitting), Public Bill Committees, Tuesday 13th December 2022
Online Safety Bill, Commons Chamber, Monday 5th December 2022
Oral Answers to Questions, Commons Chamber, Wednesday 7th July 2021
Employment Rights, Commons Chamber, Tuesday 8th June 2021
Oral Answers to Questions, Commons Chamber, Wednesday 13th January 2021
Arcadia and Debenhams: Business Support and Job Retention, Commons Chamber, Wednesday 2nd December 2020

ONLINE SAFETY BILL (First sitting)

Debate between Charlotte Nichols and Paul Scully
Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022


Public Bill Committees
Paul Scully

I will come to that in a second. The hon. Member for Luton North talked about putting the onus on the victim. Any element of choice is there for adults; the children will be protected anyway, as I will outline in a second. We all agree that the primary purpose of the Bill is to be a children’s protection measure.

Ofcom will set out in codes of practice the specific steps that providers can take to protect children who are using their service, and the Government expect those to include steps relating to children’s access to high-risk features, such as livestreaming or private messaging. Clause 11(4)(d) sets out that providers may be required to take measures in the following areas:

“policies on user access to the service or to particular content present on the service, including blocking users from accessing the service or particular content”.

The other areas listed are intentionally broad categories that allow providers to take specific measures. For example, a measure in the area of blocking user access to particular content could include specific measures that restrict children’s access to parts of a service, if that is a proportionate way to stop users accessing that type of content. It can also apply to any of the features of a service that enable children to access particular content, and could therefore include children’s access to livestreaming and private messaging features. In addition, the child safety duties make it clear that providers need to use proportionate systems and processes that prevent children from encountering primary priority content that is harmful to them, and protect children in age groups at risk of harm from other content that is harmful to them.

While Ofcom will set out in codes of practice the steps that providers can take to meet these duties, we expect those steps, as we have heard, to include the use of age verification to prevent children accessing content that poses the greatest risk of harm to them. To meet that duty, providers may use measures that restrict children from accessing parts of the service. The Bill therefore allows Ofcom to require providers to take that step where it is proportionate. I hope that that satisfies the hon. Member for Aberdeen North, and gives her the direction that she asked for—that is, a direction to be more specific that Ofcom does indeed have the powers that she seeks.

Charlotte Nichols

Because of the enforcement road map, and when Ofcom plans to set it out, we can expect little impact on child protection before 2027-28. Does the Minister not think that in the meantime, that sort of ministerial direction would be helpful? It could make Ofcom’s job easier, and would mean that children could be protected online before 2027-28.

Paul Scully

The ministerial direction that the various platforms are receiving from the Dispatch Box, from our conversations with them and from the Bill’s progress as it goes through the House of Lords will be helpful to them. We do not expect providers to wait until the very last minute to implement the measures. They are starting to do so now, but we want them to go further, quicker.

Government amendment 4 will require providers who already have a minimum age requirement for access to their service, or parts of it, to give details of the measures that they use to restrict access in their terms of service and apply them consistently. Providers will also need to provide age-appropriate protections for children using their service. That includes protecting children from harmful content and activity on their service, as well as reviewing children’s use of higher-risk features, as I have said.

To meet the child safety risk assessment duties in clause 10, providers must assess: the risk of harm to children from functionalities that facilitate the presence or dissemination of harmful content; the level of risk from different kinds of harmful content, giving separate consideration to children in different age groups; the different ways in which the service is used, and the impact of such use on the level of risk of harm; and how the design and operation of the service may increase the risks identified.

The child safety duties in clause 11 apply across all areas of the service, including the way it is operated and used by children, as well as the content present on the service. For the reasons I have set out, I am not able to accept the amendments, but I hope that the hon. Member for Aberdeen North will take on board my assurances.

ONLINE SAFETY BILL (Second sitting)

Debate between Charlotte Nichols and Paul Scully
Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022


Public Bill Committees
The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Paul Scully)

It is a pleasure to serve under your chairmanship, Dame Angela.

A lot of the discussion has replayed the debate from day two on Report about the removal of “legal but harmful” measures. Some of the discussion this morning and this afternoon has covered really important issues, such as self-harm, on which, as we said on the Floor of the House, we will introduce measures at a later stage. I will not talk about those measures now, but I would just say that we have already said that if we agree that the promotion of things such as self-harm should be illegal, it should be made illegal. Let us be very straight about how we deal with the promotion of self-harm.

The Bill will bring huge improvements for adult safety online. In addition to their duty to tackle illegal content, companies will have to provide adult users with tools to keep themselves safer. On some of the other clauses, we will talk about the triple shield that was mentioned earlier. If the content is illegal, it will still be illegal. If content does not adhere to the companies’ terms of service—that includes many of the issues that we have been debating for the last hour—it will have to be removed. We will come to user empowerment issues in further clauses.

Charlotte Nichols

The Minister mentions tools for adults to keep themselves safe. Does he not think that that puts the onus on the users—the victims—to keep themselves safe? The measures as they stood in the Bill put the onus on the companies to be more proactive about how they keep people safe.

Paul Scully

The onus on adults is very much a safety net—very much a catch-all, after we have put the onus on the social media companies and the platforms to adhere to their own terms and conditions.

We have heard a lot about Twitter and the changes to Twitter. We can see the commercial imperative for mainstream platforms, certainly the category 1 platforms, to have a wide enough catch-all in their terms of service—anything that an advertiser, for example, would see as reasonably sensible—to be able to remain a viable platform in the first place. When Elon Musk first started making changes at Twitter, a comment did the rounds: “How do you build a multimillion-dollar company? You sell it to Elon Musk for $44 billion.” He made that change. He has seen the bottom falling out of his market and has lost a lot of the cash he put into Twitter. That is the commercial impetus that underpins a lot of the changes we are making.

--- Later in debate ---
Charlotte Nichols

One of the examples I alluded to, which is particularly offensive for Jewish people, LGBT people and other people who were persecuted in the Nazi Holocaust, is Holocaust denial. Does the Minister seriously think that it is only Jewish people, LGBT people and other people who were persecuted in the Holocaust who find Holocaust denial offensive and objectionable and who do not want to see it as part of their online experience? Surely having these sorts of safety nets in place and saying that we do not think that certain kinds of content—although they may not be against the law—have a place online protects everyone’s experience, whether they are Jewish or not. Surely no one wants to see Holocaust denial online.

Paul Scully

No, but there is freedom of expression to a point—when it starts to reach into illegality. We have to have the balance right: someone can say something in public—in any session offline—but what the hon. Lady is suggesting is that, as soon as they hit a keyboard or a smartphone, there are two totally different regimes. That is not getting the balance right.

Charlotte Nichols

The Minister says that we should have freedom of speech up to a point. Does that point include Holocaust denial? He has just suggested that if something is acceptable to say in person, which I do not think Holocaust denial should be, it should be acceptable online. Surely Holocaust denial is objectionable whenever it happens, in whatever context—online or offline.

Paul Scully

I have been clear about where I set the line. [Interruption.] I have said that if something is illegal, it is illegal. The terms of service of the platforms largely cover the list that we are talking about. As my hon. Friend the Member for Folkestone and Hythe and I have both said, the terms of service of the vast majority of platforms—the big category 1 platforms—set a higher bar than was in our original Bill. The hon. Member for Luton North talked about whether we should have more evidence. I understand that the pre-legislative scrutiny committee heard evidence and came to a unanimous conclusion that the “legal but harmful” conditions should not be in the Bill.

--- Later in debate ---
Paul Scully

I have talked a little already about these amendments, so let me sum up where I think we are. I talked about harmful health content and why it is not included. The Online Safety Bill will force social media companies to tackle health misinformation and disinformation online, where it constitutes a criminal offence. It includes the communications offence, which would capture posts encouraging dangerous hoax cures, where the sender knows the information to be false and intends to cause harm, such as encouraging drinking bleach to cure cancer, which we heard about a little earlier.

The legislation is only one part of the wider Government approach to this issue. That approach includes the work of the counter-disinformation unit, which brings together cross-Government monitoring and analysis capabilities and engages with platforms directly to ensure that appropriate action is taken, in addition to the Government’s work to build users’ resilience to misinformation through media literacy.

Including harmful health content as a category risks requiring companies to apply the adult user empowerment tools to an unfeasibly large volume of content—way beyond just the vaccine efficacy content that was mentioned. That has implications both for regulatory burden and for freedom of expression, as it may capture important health advice. Similarly, on climate change, the Online Safety Bill itself will introduce new transparency, accountability and free speech duties on category 1 services. If a platform says that certain types of content are not allowed, it will be held to account for their removal.

We recognised that there was a heightened risk of disinformation surrounding the COP26 summit. The counter-disinformation unit led by the Department for Digital, Culture, Media and Sport brought together monitoring and analysis capabilities across Government to understand disinformation that posed a risk to public safety or to delegates, or that represented attempts at interference from malign actors. We are clear that free debate is essential to a democracy and that the counter-disinformation unit should not infringe upon political debate. The Government already work closely with the major social media platforms to encourage them to collaborate at speed to remove disinformation as per their terms of service.

Amendment (a) to amendment 15 and amendment (a) to amendment 16 would create a new category of content that incites hateful extremism. That is closely aligned with the approach that the Government are already taking with amendment 15, specifically proposed new subsections (8C) and (8D), which create a category of content that is abusive or incites hate on the basis of race, religion, sex, sexual orientation, disability or gender reassignment. Those conditions would likely capture the majority of the kinds of content that the hon. Members are seeking to capture through their hateful extremism category. For example, it would capture antisemitic abuse and conspiracy theories, racist abuse and promotion of racist ideologies.

Furthermore, where companies’ terms of service say that they prohibit or apply restrictions to the kind of content listed in the Opposition amendments, companies must ensure that those terms are consistently enforced. So much of this comes back to enforcement. They must also ensure that the terms of service are easily understandable.

Charlotte Nichols

If this is about companies enforcing what is in their terms of service for the use of their platforms, could it not create a perverse incentive for them to have very little in their terms of service? If they will be punished for not enforcing their terms of service, surely they will want them to be as lax as possible in order to limit their legal liability for failing to enforce them. Does the Minister follow?

Paul Scully

I follow, but I do not agree. The categories of content in proposed new subsections (8C) and (8D), introduced by amendment 15, underpin a lot of this. I answered the question in an earlier debate when talking about the commercial impetus. I cannot imagine many mainstream advertisers wanting to advertise with a company that removed from its terms of service the exclusion of racial abuse, misogyny and general abuse. We have seen that commercial impetus really kicking in with certain platforms. For those reasons, I am unable to accept the amendments to the amendments, and I hope that the Opposition will not press them to a vote.

--- Later in debate ---
Paul Scully

I appreciate the hon. Lady’s remarks. We have tried to ensure that the Bill is proportionate, inasmuch as the Secretary of State can designate content if there is a material risk of significant harm to an appreciable number of children in the United Kingdom. The Bill also requires the Secretary of State to consult Ofcom before making regulations on the priority categories of harm.

Charlotte Nichols

I appreciate that this point has been made about the same wording earlier today, but I really feel that the ambiguity of “appreciable number” is something that could do with being ironed out. The ambiguity and vagueness of that wording make it very difficult to enforce the provision. Does the Minister agree that “appreciable number” is too vague to be of real use in legislation such as this?

Paul Scully

The different platforms, approaches and conditions will necessitate different numbers; it would be hard to pin a number down. The wording is vague and wide-ranging because it is trying to capture any number of scenarios, many as yet unknown. However, the regulations designating priority harms will be made under the draft affirmative resolution procedure.

--- Later in debate ---
Damian Collins

That concern would be triggered by Ofcom discovering things as a consequence of user complaints. Although Ofcom is not a complaints resolution body, users can complain to it. Independent academics and researchers may produce studies and reports highlighting problems at any time, so Ofcom does not have to wait through an annual cycle of transparency reporting. At any time, Ofcom can say, “We want to have a deeper look at this problem.” It could be something Ofcom or someone else has discovered, and Ofcom can either research that itself or appoint an outside expert.

As the hon. Member for Warrington North mentioned, very sensitive information might become apparent through the transparency reporting that one might not necessarily wish to make public because it requires further investigation and could highlight a particular flaw that could be exploited by bad actors. I would hope and expect, as I think we all would, that we would have the routine publication of transparency reporting to give people assurance that the platforms are meeting their obligations. Indeed, if Ofcom were to intervene against a platform, it would probably use information gathered and received to provide the rationale for why a fine has been issued or another intervention has been made. I am sure that Ofcom will draw all the time on information gathered through transparency reporting and, where relevant, share it.

Paul Scully

This has been a helpful debate. Everyone was right that transparency must be and is at the heart of the Bill. From when we were talking earlier today about how risk assessments and terms of service must be accessible to all, through to this transparency reporting section, it is important that we hold companies to account and that the reports play a key role in allowing users, Ofcom and civil society, including those in academia, to understand the steps that companies are taking to protect users.

Under clause 65, category 1 services, category 2A search services and category 2B user-to-user services need to publish transparency reports annually in accordance with the transparency report notice from Ofcom. That relates to the points about commerciality that my hon. Friend the Member for Folkestone and Hythe talked about. Ofcom will set out what information is required from companies in their notice, which will also specify the format, manner and deadline for the information to be provided to Ofcom. Clearly, it would not be proportionate to require every service provider within the scope of the overall regulatory framework to produce a transparency report—it is also important that we deal with capacity and proportionality—but those category threshold conditions will ensure that the framework is flexible and future-proofed.

Charlotte Nichols

I note what the Minister said about the commercial implications of some of these things, and some of those commercial implications might act as levers to push companies to do better on some things. By that same token, should this information not be more transparent and publicly available, to give the user the choice he referred to earlier? That would mean that if a user’s data were not being properly protected and these companies were not taking the measures around safety that the public would expect, users could vote with their feet and go to a different platform. Surely that underpins a lot of what we have been talking about.

Paul Scully

Yes, and that is why Ofcom will be the one that decides which information should be published, and from whom, to ensure that it is proportionate. At the end of the day, I have talked about the fact that transparency is at the heart of the Bill and that the transparency reports are important. To go to the original point raised by the hon. Member for Pontypridd about when these reports will be published, they will indeed be published in accordance with subsection (3)(d) of the clause.

Question put and agreed to.

Clause 65 accordingly ordered to stand part of the Bill.

Schedule 8

Transparency reports by providers of Category 1 services, Category 2A services and Category 2B services

Amendments made: 61, in schedule 8, page 203, line 13, leave out

“priority content that is harmful to adults”

and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 62, in schedule 8, page 203, line 15, leave out

“priority content that is harmful to adults”

and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 63, in schedule 8, page 203, line 17, leave out

“priority content that is harmful to adults”

and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 64, in schedule 8, page 203, line 21, leave out from “or” to end of line 23 and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about user reporting of content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 65, in schedule 8, page 203, line 25, leave out

“priority content that is harmful to adults”

and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 66, in schedule 8, page 203, line 29, leave out

“priority content that is harmful to adults”

and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 67, in schedule 8, page 203, line 41, at end insert—

“11A Measures taken or in use by a provider to comply with any duty set out in section (Duty not to act against users except in accordance with terms of service) or (Further duties about terms of service) (terms of service).”

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about measures taken to comply with the new duties imposed by NC3 and NC4.

Amendment 68, in schedule 8, page 204, line 2, leave out from “illegal content” to end of line 3 and insert

“or content that is harmful to children—”.

This amendment removes the reference to content that is harmful to adults, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 69, in schedule 8, page 204, line 10, leave out from “illegal content” to “, and” in line 12 and insert

“and content that is harmful to children”.

This amendment removes the reference to content that is harmful to adults, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 70, in schedule 8, page 204, line 14, leave out from “illegal content” to “present” in line 15 and insert

“and content that is harmful to children”.

This amendment removes the reference to content that is harmful to adults, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 71, in schedule 8, page 205, line 38, after “Part 3” insert

“or Chapters 1 to 2A of Part 4”.—(Paul Scully.)

This amendment requires OFCOM, in considering which information to require from a provider in a transparency report, to consider whether the provider is subject to the duties imposed by Chapter 2A, which is the new Chapter expected to be formed by NC3 to NC6 (and Chapter 1 of Part 4).

Online Safety Bill

Debate between Charlotte Nichols and Paul Scully
Monday 5th December 2022

Commons Chamber
Paul Scully

This Bill links with other legislation and, obviously, with the agencies. We do not seek to redefine extremism where those definitions already exist. As we expand on the changes that we are making, we will first ensure that anything that is already illegal goes off the table. Anything that is against the terms and conditions of those platforms that are hosting that content must not be seen. I will come to the safety net and user protection later.

Charlotte Nichols (Warrington North) (Lab)

Since Elon Musk’s takeover of Twitter, hate speech has ballooned on the platform and the number of staff members at Twitter identifying images of child sexual abuse and exploitation has halved. How can the Minister be sure that the social media companies are able to mark their own homework in the way that he suggests?

--- Later in debate ---
Paul Scully

I thank my hon. Friend, and indeed I thank my right hon. Friend the Member for Basingstoke (Dame Maria Miller) for the amazing work that she has done in this area. We will table an amendment to the Bill to criminalise more behaviour relating to intimate image abuse, so more perpetrators will face prosecution and potentially time in jail. My hon. Friend has worked tirelessly in this area, and we have had a number of conversations. I thank her for that. I look forward to more conversations to ensure that we get the amendment absolutely right and that it does exactly what we all want.

The changes we are making will include criminalising the non-consensual sharing of manufactured intimate images, which, as we have heard, are more commonly known as deepfakes. In the longer term, the Government will also take forward several of the Law Commission’s recommendations to ensure that the legislation is coherent and takes account of advancements in technology.

We will also use the Bill to bring forward a further communication offence to make the encouragement of self-harm illegal. We have listened to parliamentarians and stakeholders concerned about such behaviour and will use the Bill to criminalise that activity, providing users with protections from that harmful content. I commend my right hon. Friend the Member for Haltemprice and Howden on his work in this area and his advocacy for such a change.

Charlotte Nichols

Intimate image abuse has been raised with me a number of times by younger constituents, who are particularly vulnerable to such abuse. Within the scope of what we are discussing, I am concerned that we have seen only one successful conviction for revenge porn, so if the Government base their intimate image work on the existing legislative framework for revenge porn, it will do nothing and protect no one, and will instead be a waste of everyone’s time and further let down victims who are already let down by the system.

Paul Scully

We will actually base that work on the independent Law Commission’s recommendations, and have been working with it on that basis.

Oral Answers to Questions

Debate between Charlotte Nichols and Paul Scully
Wednesday 7th July 2021


Commons Chamber
Paul Scully

The Minister for Equalities is doing a lot of work in this area, as is our Department of Health and Social Care. We are committed to reducing inequalities in health outcomes, and Professor Jacqueline Dunkley-Bent OBE, the chief midwifery officer, is leading work to understand why mortality rates are high, consider the evidence and bring action together, because this is a complex situation. It is not just within maternity; it is far more holistic than that. It covers, for instance, whether people are accessing health services in the first place, and the fact that we had some of the highest rates in the EU of obesity and of being underweight going into maternity, and the highest rates of smoking in pregnancy in the EU—indeed, our level is even higher than America’s.

Charlotte Nichols (Warrington North) (Lab)

Research from the TUC has found that one in four pregnant women and new mums experienced unfair treatment or discrimination at work during the pandemic, including being singled out for redundancy or furlough. The imminent tapering off of furlough prompts serious concern about unequal redundancies. Will the Minister follow Labour’s lead and, instead of the Government simply extending their ineffective and complicated laws, make things simpler and more robust for mothers and businesses alike by introducing a German-style ban on making a pregnant woman or new mother redundant from notification of pregnancy to six months after they return to work?

Paul Scully

We believe that extending the MAPLE—Maternity and Parental Leave etc. Regulations 1999—provisions is a better way of doing it that goes with the grain of the tribunal system that we have within this country. That is why, after due consideration, we will be bringing that forward as soon as parliamentary time allows.

Employment Rights

Debate between Charlotte Nichols and Paul Scully
Tuesday 8th June 2021


Commons Chamber
Charlotte Nichols (Warrington North) (Lab) [V]

Polling for the GMB union found that 76% of the public want fire and rehire to be banned, including 71% of Conservative voters. If only unscrupulous employers use fire-and-rehire tactics, as the Minister said in a previous answer, a non-legislative solution will do absolutely nothing. How much more consensus is needed before the Minister acts to ban fire and rehire, rather than warm words that do nothing to protect workers in his constituency or mine?

Paul Scully

I have noticed that I can shrink my long list of responsibilities in the ministerial portfolio down to “Minister for unintended consequences”. I do not want to reach for legislation, which is a blunt instrument, as if this were a binary issue; that would have unintended consequences for people’s jobs and livelihoods. We want to have a flexible economy so that we get both right.

Oral Answers to Questions

Debate between Charlotte Nichols and Paul Scully
Wednesday 13th January 2021


Commons Chamber
Paul Scully

I thank my hon. Friend for what she is doing to encourage such employment. We are committed to having a fair recovery for all. During the crisis we have rolled out unprecedented levels of support to protect jobs for both women and men. Yes, of course I would be happy to meet her to discuss what more we can do to stimulate employment, including female employment, in the months ahead.

Charlotte Nichols (Warrington North) (Lab)

Research by the Trades Union Congress shows that about 90% of mothers have taken on more childcare responsibilities since the pandemic began, with 43% having to balance childcare with working from home. This is a particular pressure for single-parent households, the majority of which, research shows, are headed by women. With women at greater risk of redundancy and disproportionately employed in sectors hardest hit by shutdowns, will the Minister commit to creating a legal, enforceable and immediate right for parents to request paid, flexible furlough?

Paul Scully

Certainly furlough is available for women and, indeed, men who have childcare responsibilities. It is for the employer to decide whether to grant it, but if women feel unduly disadvantaged, they can approach ACAS.

Arcadia and Debenhams: Business Support and Job Retention

Debate between Charlotte Nichols and Paul Scully
Wednesday 2nd December 2020


Commons Chamber


Paul Scully

I know Guildford very well. It is a destination for residents around Surrey and further afield. Yes, we must all work together to get the balance right so that we do not hollow out our town centres, including Guildford.

Charlotte Nichols (Warrington North) (Lab)

The Debenhams liquidation is a tragedy not only for the thousands of Debenhams employees but for all retailers in shopping centres like Warrington’s Golden Square, where Debenhams is the anchor department store driving footfall for the whole centre. With Arcadia brand stores in Golden Square also at risk, and confidence in the wider retail sector waning, what specific support will shopping centres like Golden Square get to protect all their retailers, their employees and the vibrancy of our town centres?

Paul Scully

In terms of shopping centres, it is really important that we get the balance right between landlords and tenants. The moratorium helps tenants but clearly does not help landlords. We will work with the retail sector to try to achieve that balance in the weeks and months to come.