Debates between Alex Davies-Jones and Maria Miller during the 2019-2024 Parliament

Tue 12th Jul 2022: Online Safety Bill, Commons Chamber, Report stage (day 1)
Tue 28th Jun 2022: Online Safety Bill (Seventeenth sitting), Public Bill Committees
Tue 21st Jun 2022: Online Safety Bill (Thirteenth sitting), Public Bill Committees, Committee stage: 13th sitting
Thu 16th Jun 2022: Online Safety Bill (Twelfth sitting), Public Bill Committees
Tue 14th Jun 2022: Online Safety Bill (Tenth sitting), Public Bill Committees
Tue 14th Jun 2022: Online Safety Bill (Ninth sitting), Public Bill Committees
Thu 9th Jun 2022: Online Safety Bill (Eighth sitting), Public Bill Committees
Thu 26th May 2022: Online Safety Bill (Fourth sitting), Public Bill Committees, Committee stage: 4th sitting
Thu 26th May 2022: Online Safety Bill (Third sitting), Public Bill Committees, Committee stage: 3rd sitting
Tue 24th May 2022: Online Safety Bill (First sitting), Public Bill Committees

Online Safety Bill

Debate between Alex Davies-Jones and Maria Miller
Report stage (day 1)
Tuesday 12th July 2022

Commons Chamber
Alex Davies-Jones

I thank the hon. Gentleman, who is absolutely right. In Committee, we debated at length the impact search engines have, and they should be included in the Bill’s categorisation of difficult issues. In one recent example on a search engine, the imagery that comes up when we search for desk ornaments is utterly appalling and needs to be challenged and changed. If we are to truly tackle antisemitism, racism and extremist content online, then the provisions need to be included in the Bill, and journalistic exemptions should not apply to this type of content. Often, they operate more discreetly and are less likely to attract sanctions. Furthermore, any amendment will provide no answer to the many extremist publishers who seek to exploit the terms of the exemption. For those reasons, we need to go further.

The amendments are not a perfect or complete solution. Deficiencies remain, and the amendments do not address the fact that the exemption continues to exclude dozens of independent local newspapers around the country on the arbitrary basis that they have no fixed address. The Independent Media Association, which represents news publishers, describes the news publisher criteria as

“punishing quality journalism with high standards”.

I hope the Minister will reflect further on that point. As a priority, we need to ensure that the exemption cannot be exploited by bad actors. We must not give a free pass to those propagating racist, misogynistic or antisemitic harm and abuse. By requiring some standards of accountability for news providers, however modest, the amendments are an improvement on the Bill as drafted. In the interests of national security and the welfare of the public, we must support the amendments.

Finally, I come to a topic that I have spoken about passionately in this place on a number of occasions and that is extremely close to my heart: violence against women and girls. Put simply, in their approach to the Bill the Government are completely failing and falling short in their responsibilities to keep women and girls safe online. Labour has been calling for better protections for some time now, yet still the Government are failing to see the extent of the problem. They have only just published an initial indicative list of priority harms to adults, in a written statement that many colleagues may have missed. While it is claimed that this will add to scrutiny and debate, the final list of harms will not be on the face of the Bill but will be included in secondary legislation after the Bill has received Royal Assent. Non-designated content that is harmful will not require action on the part of service providers, even though by definition it is still extremely harmful. How can that be acceptable?

Many campaigners have made the case that protections for women and girls are not included in the draft Bill at all, a concern supported by the Petitions Committee in its report on online abuse. Schedule 7 includes a list of sexual offences and aggravated offences, but the Government have so far made no concessions here and the wider context of violence against women and girls has not been addressed. That is why I urge the Minister to carefully consider our new clause 3, which seeks to finally name violence against women and girls as a priority harm. The Minister’s predecessor said in Committee that women and girls receive “disproportionate” levels of abuse online. The Minister in his new role will likely be well briefed on the evidence, and I know this is an issue he cares passionately about. The case has been put forward strongly by hon. Members on all sides of the House, and the message is crystal clear: women and girls must be protected online, and we see this important new clause as the first step.

Later on, we hope to see the Government move further and acknowledge that there must be a code of practice on tackling violence against women and girls content online.

Dame Maria Miller

The hon. Lady raises the issue of codes of practice. She will recall that in Committee we talked about that specifically and pressed the then Minister on that point. It became very clear that Ofcom would be able to issue a code of practice on violence against women and girls, which she talked about. Should we not be seeking an assurance that Ofcom will do that? That would negate the need to amend the Bill further.

Alex Davies-Jones

I welcome the right hon. Lady’s comments. We did discuss this at great length in Committee, and I know she cares deeply and passionately about this issue, as do I. It is welcome that Ofcom can issue a code of practice on violence against women and girls, and we should absolutely be urging it to do that, but we also need to make it a fundamental aim of the Bill. If the Bill is to be truly world leading, if it is truly to make us all safe online, and if we are finally to begin to tackle the scourge of violence against women and girls in all its elements—not just online but offline—then violence against women and girls needs to be named as a priority harm in the Bill. We need to take the brave new step of saying that enough is enough. Words are not enough. We need actions, and this is an action the Minister could take.

Online Safety Bill (Seventeenth sitting)

Debate between Alex Davies-Jones and Maria Miller
Committee stage
Tuesday 28th June 2022

Public Bill Committees
Dame Maria Miller (Basingstoke) (Con)

On a point of order, Ms Rees. On behalf of the Back Benchers, I thank you and Sir Roger for your excellent chairpersonships, and the Minister and shadow Ministers for the very courteous way in which proceedings have taken place. It has been a great pleasure to be a member of the Bill Committee.

Alex Davies-Jones

I am content with the Minister’s assurance that the provisions of new clause 41 are covered in the Bill, and therefore do not wish to press it to a vote. I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

New Schedule 2

Recovery of OFCOM’s initial costs

Recovery of initial costs

1 (1) This Schedule concerns the recovery by OFCOM of an amount equal to the aggregate of the amounts of WTA receipts which, in accordance with section 401(1) of the Communications Act and OFCOM’s statement under that section, are retained by OFCOM for the purpose of meeting their initial costs.

(2) OFCOM must seek to recover the amount described in sub-paragraph (1) (“the total amount of OFCOM’s initial costs”) by charging providers of regulated services fees under this Schedule (“additional fees”).

(3) In this Schedule—

“initial costs” means the costs incurred by OFCOM before the day on which section 75 comes into force on preparations for the exercise of their online safety functions;

“WTA receipts” means the amounts described in section 401(1)(a) of the Communications Act which are paid to OFCOM (certain receipts under the Wireless Telegraphy Act 2006).

Recovery of initial costs: first phase

2 (1) The first phase of OFCOM’s recovery of their initial costs is to take place over a period of several charging years to be specified in regulations under paragraph 7 (“specified charging years”).

(2) Over that period OFCOM must, in aggregate, charge providers of regulated services additional fees of an amount equal to the total amount of OFCOM’s initial costs.

(3) OFCOM may not charge providers additional fees in respect of any charging year which falls before the first specified charging year.

(4) OFCOM may require a provider to pay an additional fee in respect of a charging year only if the provider is required to pay a fee in respect of that year under section 71 (and references in this Schedule to charging providers are to be read accordingly).

(5) The amount of an additional fee payable by a provider is to be calculated in accordance with regulations under paragraph 7.

Further recovery of initial costs

3 (1) The second phase of OFCOM’s recovery of their initial costs begins after the end of the last of the specified charging years.

(2) As soon as reasonably practicable after the end of the last of the specified charging years, OFCOM must publish a statement specifying—

(a) the amount which is at that time the recoverable amount (see paragraph 6), and

(b) the amounts of the variables involved in the calculation of the recoverable amount.

(3) OFCOM’s statement must also specify the amount which is equal to that portion of the recoverable amount which is not likely to be paid or recovered. The amount so specified is referred to in sub-paragraphs (4) and (5) as “the outstanding amount”.

(4) Unless a determination is made as mentioned in sub-paragraph (5), OFCOM must, in aggregate, charge providers of regulated services additional fees of an amount equal to the outstanding amount.

(5) The Secretary of State may, as soon as reasonably practicable after the publication of OFCOM’s statement, make a determination specifying an amount by which the outstanding amount is to be reduced, and in that case OFCOM must, in aggregate, charge providers of regulated services additional fees of an amount equal to the difference between the outstanding amount and the amount specified in the determination.

(6) Additional fees mentioned in sub-paragraph (4) or (5) must be charged in respect of the charging year immediately following the last of the specified charging years (“year 1”).

(7) The process set out in sub-paragraphs (2) to (6) is to be repeated in successive charging years, applying those sub-paragraphs as if—

(a) in sub-paragraph (2), the reference to the end of the last of the specified charging years were to the end of year 1 (and so on for successive charging years);

(b) in sub-paragraph (6), the reference to year 1 were to the charging year immediately following year 1 (and so on for successive charging years).

(8) Any determination by the Secretary of State under this paragraph must be published in such manner as the Secretary of State considers appropriate.

(9) Sub-paragraphs (4) and (5) of paragraph 2 apply to the charging of additional fees under this paragraph as they apply to the charging of additional fees under that paragraph.

(10) The process set out in this paragraph comes to an end in accordance with paragraph 4.

End of the recovery process

4 (1) The process set out in paragraph 3 comes to an end if a statement by OFCOM under that paragraph records that—

(a) the recoverable amount is nil, or

(b) all of the recoverable amount is likely to be paid or recovered.

(2) Or the Secretary of State may bring that process to an end by making a determination that OFCOM are not to embark on another round of charging providers of regulated services additional fees.

(3) The earliest time when such a determination may be made is after the publication of OFCOM’s first statement under paragraph 3.

(4) A determination under sub-paragraph (2)—

(a) must be made as soon as reasonably practicable after the publication of a statement by OFCOM under paragraph 3;

(b) must be published in such manner as the Secretary of State considers appropriate.

(5) A determination under sub-paragraph (2) does not affect OFCOM’s power—

(a) to bring proceedings for the recovery of the whole or part of an additional fee for which a provider became liable at any time before the determination was made, or

(b) to act in accordance with the procedure set out in section 120 in relation to such a liability.

Providers for part of a year only

5 (1) For the purposes of this Schedule, the “provider” of a regulated service, in relation to a charging year, includes a person who is the provider of the service for part of the year.

(2) Where a person is the provider of a regulated service for part of a charging year only, OFCOM may refund all or part of an additional fee paid to OFCOM under paragraph 2 or 3 by that provider in respect of that year.

Calculation of the recoverable amount

6 For the purposes of a statement by OFCOM under paragraph 3, the “recoverable amount” is given by the formula—

C – (F – R) – D

where—

C is the total amount of OFCOM’s initial costs,

F is the aggregate amount of the additional fees received by OFCOM at the time of the statement in question,

R is the aggregate amount of the additional fees received by OFCOM that at the time of the statement in question have been, or are due to be, refunded (see paragraph 5(2)), and

D is the amount specified in a determination made by the Secretary of State under paragraph 3 (see paragraph 3(5)) at a time before the statement in question or, where more than one such determination has been made, the sum of the amounts specified in those determinations.

If no such determination has been made before the statement in question, D = 0.

Regulations about recovery of initial costs

7 (1) The Secretary of State must make regulations making such provision as the Secretary of State considers appropriate in connection with the recovery by OFCOM of their initial costs.

(2) The regulations must include provision as set out in sub-paragraphs (3), (4) and (6).

(3) The regulations must specify the total amount of OFCOM’s initial costs.

(4) For the purposes of paragraph 2, the regulations must specify—

(a) the charging years in respect of which additional fees are to be charged, and

(b) the proportion of the total amount of initial costs which OFCOM must seek to recover in each of the specified charging years.

(5) The following rules apply to provision made in accordance with sub-paragraph (4)(a)—

(a) the initial charging year may not be specified;

(b) only consecutive charging years may be specified;

(c) at least three charging years must be specified;

(d) no more than five charging years may be specified.

(6) The regulations must specify the computation model that OFCOM must use to calculate fees payable by individual providers of regulated services under paragraphs 2 and 3 (and that computation model may be different for different charging years).

(7) The regulations may make provision about what OFCOM may or must do if the operation of this Schedule results in them recovering more than the total amount of their initial costs.

(8) The regulations may amend this Schedule or provide for its application with modifications in particular cases.

(9) Before making regulations under this paragraph, the Secretary of State must consult—

(a) OFCOM,

(b) providers of regulated user-to-user services,

(c) providers of regulated search services,

(d) providers of internet services within section 67(2), and

(e) such other persons as the Secretary of State considers appropriate.

Interpretation

8 In this Schedule—

“additional fees” means fees chargeable under this Schedule in respect of the recovery of OFCOM’s initial costs;

“charging year” has the meaning given by section 76;

“initial charging year” has the meaning given by section 76;

“initial costs” has the meaning given by paragraph 1(3), and the “total amount” of initial costs means the amount described in paragraph 1(1);

“recoverable amount” has the meaning given by paragraph 6;

“specified charging year” means a charging year specified in regulations under paragraph 7 for the purposes of paragraph 2.” —(Chris Philp.)

This new Schedule requires Ofcom to seek to recover their costs which they have incurred (before clause 75 comes into force) when preparing to take on functions as the regulator of services under the Bill by charging fees to providers of services.

Brought up, read the First and Second time, and added to the Bill.
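
For illustration only, and not as part of the schedule text or the Hansard record above, the recoverable-amount calculation in paragraph 6 of the new Schedule can be sketched as follows. The formula is the one set out in paragraph 6; the figures used in the example are hypothetical.

```python
# Minimal sketch of the paragraph 6 calculation: recoverable amount = C - (F - R) - D.
# C: total amount of OFCOM's initial costs
# F: aggregate additional fees received at the time of the statement
# R: fees received that have been, or are due to be, refunded
# D: sum of amounts specified in any Secretary of State determinations (0 if none)
def recoverable_amount(c: float, f: float, r: float, d: float = 0.0) -> float:
    return c - (f - r) - d

# Hypothetical example: 100m initial costs, 60m fees received, 5m refunded,
# and no determination made, so D defaults to 0.
print(recoverable_amount(100_000_000, 60_000_000, 5_000_000))  # 45000000.0
```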

Online Safety Bill (Thirteenth sitting)

Debate between Alex Davies-Jones and Maria Miller
Committee stage & Committee Debate - 13th sitting
Tuesday 21st June 2022

Public Bill Committees
Alex Davies-Jones

We support clause 132, which ensures that Ofcom is required to understand and measure public opinion concerning providers of regulated services, as well as the experiences and interests of those using the regulated services in question. The Bill in its entirety is very much a learning curve for us all, and I am sure we all agree that, as I have previously maintained, the world really is watching as we seek to develop and implement the legislation. That is why it is vital that Ofcom is compelled to conduct and arrange its own research to ensure that we are getting an accurate picture of how our regulatory framework is affecting people. I stress to the Minister that it is imperative that Ofcom consults all service providers—big and small—a point that the CBI stressed to me in recent meetings.

We also welcome the provisions outlined in subsection (2) that confirm that Ofcom must include a statement of its research in its annual report to the Secretary of State and the devolved Administrations. It is important that Ofcom, as a regulator, takes a research-led approach, and Labour is pleased to see these provisions included in the Bill.

We welcome the inclusion of clause 133, which extends the communication panel’s remit to include online safety. This will mean that the panel is able to give advice on matters relating to different types of online content under the Bill, and on the impacts of online content on UK users of regulated services. It is a welcome step forward, so we have not sought to amend the clause.

Dame Maria Miller (Basingstoke) (Con)

I want to make one short comment about clauses 132 and 133, which are really important. There is no intention to interfere with or fetter the way that Ofcom operates, but there is an obligation on this Committee, and on Parliament, to indicate what we would expect to see from Ofcom by way of the clauses, because they are an essential part of the transparency that we are trying to inject into the sector.

Research about users’ experiences is hugely important, and such reports contain important insights into how platforms are used, and the levels of misinformation and disinformation that people are exposed to. Ofcom already produces highly authoritative reports on various aspects of the online world, including the fact that three in four adults do not think about whether the online information that they see is truthful. Indeed, one in three adults believes that all or most information that they find online is truthful. We know that there is a significant gap between consumers’ perception and reality, so it is important to ensure that research has good exposure among those using the internet.

We do not often hear about the problems of how the online world works, and the level of disinformation and inaccuracy is not well known, so will the Minister elaborate on how he expects Ofcom to ensure that people are aware of the reality of the online world? Platforms will presumably be required to have regard to the content of Ofcom reports, but will Ofcom be required to publicise its reports? It is not clear that such a duty is in the Bill at the moment, so does the Minister expect Ofcom to have a role in educating people, especially children, about the problem of inaccurate data or other aspects of the online world?

We know that a number of platforms spend a great deal of money on going into schools and talking about their products, which may or may not entail accurate information. Does Ofcom not have an important role to play in this area? Educating users about the changes in the Bill would be another potential role for Ofcom in order to recalibrate users’ expectations as to what they might reasonably expect platforms to offer as a result of the legislation. It is important that we have robust regulatory frameworks in place, and this Bill clearly does that. However, it also requires users to be aware of the changes that have been made so that they can report the problems they experience in a timely manner.

Online Safety Bill (Twelfth sitting)

Debate between Alex Davies-Jones and Maria Miller
Committee stage
Thursday 16th June 2022

Public Bill Committees
Alex Davies-Jones

I will be brief. Labour welcomes clause 110, which addresses the process of starting enforcement. We support the process, particularly the point that ensures that Ofcom must first issue a “provisional notice of contravention” to an entity before it reaches its final decision.

The clause ultimately ensures that the process for Ofcom issuing a provisional notice of contravention can take place only after a full explanation and deadline have been provided for those involved. Thankfully, this process means that Ofcom can reach a decision only after allowing the recipient a fair opportunity to make relevant representations too. The process must be fair for all involved and that is why we welcome the provisions outlined in the clause.

Dame Maria Miller (Basingstoke) (Con)

I hope that I am speaking at the right stage of the Bill, and I promise not to intervene at any further stages where this argument could be put forward.

Much of the meat of the Bill is within chapter 6. It establishes what many have called the “polluter pays” principle, where an organisation that contravenes can then be fined—a very important part of the Bill. We are talking about how Ofcom is going to be able to make the provisions that we have set out work in practice. A regulated organisation that fails to stop harm contravenes and will be fined, and fined heavily.

I speak at this point in the debate with slight trepidation, because these issues are also covered in clause 117 and schedule 12, but it is just as relevant to debate the point at this stage. It is difficult to understand where in the Bill the Government set out how the penalties that they can levy as a result of the powers under this clause will be used. Yes, they will be a huge deterrent, and that is good in its own right and important, but surely the real opportunity is to make the person who does the harm pay for righting the wrong that they have created.

That is not a new concept. Indeed, it is one of the objectives that the Government set out in the intentions behind their approach to the draft victims Bill. It is a concept used in the Investigatory Powers Act 2016. It is the concept behind the victims surcharge. So how does this Bill make those who cause harm take greater responsibility for the cost of supporting victims to recover from what they have suffered? That is exactly what the Justice Ministers set out as being so important in their approach to victims. In the Bill, that is not clear to me.

At clause 70, the Minister helpfully set out that there was absolutely no intention for Ofcom to have a role in supporting victims individually. In reply to the point that I made at that stage, he said that the victims Bill would address some of the issues—I am sure that he did not say all the issues, but some of them at least. I do not believe that it will. The victims Bill establishes a code and a duty to provide victim support, but it makes absolutely no reference to how financial penalties on those who cause harm—as set out so clearly in this Bill—will be used to support victims. How will they support victims’ organisations, which do so much to help in particular those who do not end up in court, before a judge, because what they have suffered does not warrant that sort of intervention?

I believe that there is a gap. We heard that in our evidence session, including from Ofcom itself, which identified the need for law enforcement, victim-support organisations and platforms themselves to find what the witnesses described as an effective way for the new “ecosystem” to work. Victim-support organisations went further and argued strongly for the need for victims’ voices to be heard independently. The NSPCC in particular made a very powerful argument for children’s voices needing to be heard and for having independent advocacy. There would be a significant issue with trust levels if we were to rely solely on the platforms themselves to provide such victim support.

There are a couple of other reasons why we need the Government to tease the issue out. We are talking about the most significant culture change imaginable for the online platforms to go through. There will be a lot of good will, I am sure, to achieve that culture change, but there will also be problems along the way. Again referring back to our evidence sessions, the charity Refuge said that reporting systems are “not up to scratch” currently. There is a lot of room for change. We know that Revenge Porn Helpline has seen a continual increase in demand for its services in support of victims, in particular following the pandemic. It also finds revenue and funding a little hand to mouth.

Victim support organisations will have a crucial role in assisting Ofcom with the elements outlined in chapter 6, of which clause 110 is the start, in terms of monitoring the reality for users of how the platforms are performing. The “polluter pays” principle is not working quite as the Government might want it to in the Bill. My solution is for the Minister to consider talking to his colleagues in the Treasury about whether this circle could be squared—whether we could complete the circle—by having some sort of hypothecation of the financial penalties, so that some of the huge amount that will be levied in penalties can be put into a fund that can be used directly to support victims’ organisations. I know that that requires the Department for Digital, Culture, Media and Sport and the Ministry of Justice to work together, but my hon. Friend is incredibly good at collaborative working, and I am sure he will be able to achieve that.

This is not an easy thing. I know that the Treasury would not welcome Committees such as this deciding how financial penalties are to be used, but this is not typical legislation. We are talking about enormous amounts of money and enormous numbers of victims, as the Minister himself has set out when we have tried to debate some of these issues. He could perhaps undertake to raise this issue directly with the Treasury, and perhaps get it to look at how much money is currently going to organisations to support victims of online abuse and online fraud—the list goes on—and to see whether we will have to take a different approach to ensure that the victims we are now recognising get the support he and his ministerial colleagues want to see.

Online Safety Bill (Tenth sitting)

Debate between Alex Davies-Jones and Maria Miller
Committee stage
Tuesday 14th June 2022

Public Bill Committees
Alex Davies-Jones

I have some brief comments on the clause. The Labour party very much welcomes the addition to user verification duties in the revised Bill. A range of groups, including Clean Up the Internet, have long campaigned for a verification requirement process, so this is a positive step forward.

We do, however, have some concerns about the exact principles and minimum standards for the user verification duty, which I will address when we consider new clause 8. We also have concerns about subsection (2), which states:

“The verification process may be of any kind (and in particular, it need not require documentation to be provided).”

I would be grateful if the Minister could clarify exactly what that process will look like in practice.

Lastly, as Clean Up the Internet has said, we need further clarification on whether users will be given a choice of how they verify and of the verification provider itself. We can all recognise that there are potential downsides to the companies that own the largest platforms—such as Meta, Google, Twitter and ByteDance—developing their own in-house verification processes and making them the only option for users wishing to verify on their platform. Indeed, some users may have reservations about sharing even more personal data with those companies. Users of multiple social media platforms could also be required to go through a different verification process on each platform to achieve the same outcome of confirming their real name, which they may find inconvenient and confusing.

There is a risk of the largest platforms seeking to leverage their dominance of social media to capture the market for ID verification services, raising competition concerns. I would be grateful if the Minister could confirm his assessment of the potential issues around clause 57 as it stands.

Dame Maria Miller

I rise to welcome clause 57. It is an important part of the Bill and shows the Government acknowledging that anonymity can have a significant impact on the harms that affect victims. There is a catalogue of evidence of the harm done by those posting anonymously. Anonymity appears to encourage abusive behaviour, and there is evidence dating back to 2015 showing that anonymous accounts are more likely to share sexist comments and that online harassment victims are often not able to identify their perpetrators because of the way anonymity works online. The Government are doing an important thing here and I applaud them.

I underline that again by saying that recent research from Compassion in Politics showed that more than one in four people were put off posting on social media because of the fear of abuse, particularly from anonymous posters. Far from the status quo promoting freedom of speech, it actually deters freedom of speech, as we have said in other debates, and it particularly affects women. The Government are to be applauded for this measure.

In the work I was doing with the FA and the Premier League around this very issue, I particularly supported their call for a twin-track approach to verified accounts that said that they should be the default and that people should automatically be able to opt out of receiving posts from unverified accounts. The Bill does not go as far as that, and I can understand the Government’s reasons, but I gently point out that 81% of the people who took part in the Compassion in Politics research would willingly provide identification to get a verified account if it reduced unverified posts. They felt that was important. Some 72% supported the idea if it reduced the amount of anonymous posting.

I am touching on clause 58, but I will not repeat myself when we debate that clause. I hope that it will be possible in the code of practice for Ofcom to point out the clear benefits of having verified accounts by default and perhaps urge responsible providers to do the responsible thing and allow their users to automatically filter out unverified accounts. That is what users want, and it is extraordinary that large consumer organisations do not seem to want to give consumers what they want. Perhaps Ofcom can help those organisations understand what their consumers want, certainly in Britain.

--- Later in debate ---
Alex Davies-Jones

I wholeheartedly agree with my hon. Friend. As I have said, the amendments would put in place rock-bottom safety measures that could prevent the most appalling abuses on pornography websites, and it is a scandal that, hitherto, they have not been implemented. We have the opportunity to change that today by voting for the amendments and ensuring that these measures are in place. I urge the Minister and Conservative Members to do the right thing.

Dame Maria Miller

I thank the hon. Lady for giving way. I can understand the intent behind what she is saying and I have a huge amount of sympathy for it, but we know as a matter of fact that many of the images that are lodged on these sorts of websites were never intended to be pornographic in the first place. They may be intimate images taken by individuals of themselves—or, indeed, of somebody else—that are then posted as pornographic images. I am slightly concerned that an image such as that may not be caught by the hon. Lady’s amendments. Would she join me in urging the Government to bring forward the Law Commission’s recommendations on the taking, making and sharing of intimate images online without consent, which are far broader? They would probably do what she wants to do but not run into the problem of whether an image was meant to be pornographic in the first place.

Alex Davies-Jones

I am grateful to the right hon. Member for her intervention. She knows that I have the utmost respect for all that she has tried to achieve in this area in the House along with my right hon. Friend the Member for Kingston upon Hull North.

We feel these amendments would capture the specific issue of imagery or video content for which consent has not been obtained. Many of these people do not even know that the content has been taken in the first place, and it is then uploaded to these websites. It would be the website’s duty to verify that consent had been obtained and that the people in the video were of the age of consent. That is why we urge hon. Members to back the amendments.

Online Safety Bill (Ninth sitting)

Debate between Alex Davies-Jones and Maria Miller
Committee stage
Tuesday 14th June 2022

Public Bill Committees
Alex Davies-Jones

I am grateful for the hon. Member’s contribution, and for her support for the amendment and our comments on the clause.

The Bill should be made clearer, and I would appreciate an update on the Minister’s assessment of the provisions in the Bill. Platforms and service providers need clarity if they are to take effective action against illegal content. Gaps in the Bill give rise to serious questions about the overwhelming practical challenges it presents. None of us wants a two-tier internet, in which user experience and platforms’ responsibilities in the UK differ significantly from those in the rest of the world. Clarifying the definition of illegal content and acknowledging the complexity of the situation when content originates abroad are vital if this legislation is to tackle wide-ranging, damaging content online. That is a concern I raised on Second Reading, and a number of witnesses reiterated it during the oral evidence sessions. I remind the Committee of the comments of Kevin Bakhurst from Ofcom, who said:

“We feel it is really important—hopefully this is something the Committee can contribute to—that the definition of ‘illegal content’ is really clear for platforms, and particularly the area of intent of illegality, which at the moment might be quite tricky for the platforms to pick up on.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 8, Q7.]

That has been reiterated by myriad other stakeholders, so I would be grateful for the Minister’s comments.

Dame Maria Miller

I rise to speak on clause 52 stand part, particularly—the Minister will not be surprised—the element in subsection (4)(c) around the offences specified in schedule 7. The debate has been very wide ranging throughout our sittings. It is extraordinary that we need a clause defining what is illegal. Presumably, most people who provide goods and services in this country would soon go out of business if they were not knowledgeable about what is illegal. The Minister is helping the debate very much by setting out clearly what is illegal, so that people who participate in the social media world are under no illusion as to what the Government are trying to achieve through this legislation.

The truth is that the online world has unfolded without a regulatory framework. New offences have emerged, and some of them are tackled in the Bill, particularly cyber-flashing. Existing offences have taken on a new level of harm for their victims, particularly when it comes to taking, making and sharing intimate images without consent. As the Government have already widely acknowledged, because the laws on that are such a patchwork, it is difficult for the enforcement agencies in this country to adequately protect the victims of that heinous crime, who are, as the Minister knows, predominantly women.

Online Safety Bill (Eighth sitting)

Debate between Alex Davies-Jones and Maria Miller
Committee stage
Thursday 9th June 2022

Public Bill Committees
Alex Davies-Jones

On clause 37, it is welcome that Ofcom will have to prepare and issue a code of practice for service providers with duties relating to illegal content in the form of terrorism or child sexual exploitation and abuse content. The introduction of compliance measures relating to fraudulent advertising is also very welcome. We do, however, have some important areas to amend, including the role of different expert groups in assisting Ofcom during its consultation process, which I have already outlined in relation to animal cruelty.

On clause 38, Labour supports the notion that Ofcom must have specific principles to adhere to when preparing the codes of practice, and of course, the Secretary of State must have oversight of those. However, as I will touch on as we proceed, Labour feels that far too much power is given to the Secretary of State of the day in establishing those codes.

Labour believes that schedule 4 is overwhelmingly loose in its language, and we have concerns about the ability of Ofcom—try as it might—to ensure that its codes of practice are both meaningful to service providers and in compliance with the Bill’s legislative requirements. Let me highlight the schedule’s broadness by quoting from it. Paragraph 4 states:

“The online safety objectives for regulated user-to-user services are as follows”.

I will move straight to paragraph 4(a)(iv), which says

“there are adequate systems and processes to support United Kingdom users”.

Forgive me if I am missing something here, but surely an assessment of adequacy is too subjective for these important codes of practice. Moreover, the Bill seems to have failed to consider the wide-ranging differences that exist among so-called United Kingdom users. Once again, there is no reference to future-proofing against emerging technologies. I hope that the Minister will therefore elaborate on how he sees the codes of practice and their principles, objectives and content as fit for purpose. More broadly, it is remarkable that schedule 4 is both too broad in its definitions and too limiting in some areas—we might call it a Goldilocks schedule.

I turn to new clause 20. As we have discussed, a significant majority of online child abuse takes place in private messages. Research from the NSPCC shows that 12 million of the 18.4 million child sexual abuse reports made by Facebook in 2019 related to content shared on private channels. Recent data from the Office for National Statistics shows that private messaging plays a central role in contact between children and people whom they have not met offline before. When children are contacted by someone they do not know, in nearly three quarters of cases that takes place by private message.

Schedule 4 introduces new restrictions on Ofcom’s ability to require a company to use proactive technology to identify or disrupt abuse in private messaging. That will likely restrict Ofcom’s ability to include in codes of practice widely used industry-standard tools such as PhotoDNA and CSAI Match, which detect known child abuse images, and artificial intelligence classifiers to detect self-generated images and grooming behaviour. That raises significant questions about whether the regulator can realistically produce codes of practice that respond to the nature and extent of the child abuse threat.

As it stands, the Bill will leave Ofcom unable to require companies to proactively use technology that can detect child abuse. Instead, Ofcom will be wholly reliant on the use of CSEA warning notices under clause 103, which will enable it to require the use of proactive technologies only where there is evidence that child abuse is already prevalent—in other words, where significant online harm has already occurred. That will necessitate the use of a laborious and resource-intensive process, with Ofcom having to build the evidence to issue CSEA warning notices company by company.

Those restrictions will mean that the Bill will be far less demanding than comparable international legislation in respect of the requirement on companies to proactively detect and remove online child abuse. So much for the Bill being world leading. For example, the EU child abuse legislative proposal published in May sets out clear and unambiguous requirements on companies to proactively scan for child abuse images and grooming behaviour on private messages.

If the regulator is unable to tackle online grooming sufficiently proactively, the impact will be disproportionately felt by girls. NSPCC data shows that an overwhelming majority of online grooming offences target girls, with those aged 12 to 15 the most likely to be victims of online grooming. Girls were victims in 83% of offences where data was recorded. Labour recognises that once again there are tensions between our fundamental right to privacy and the Bill’s intentions in keeping children safe. This probing new clause is designed to give the Government an opportunity to report on the effectiveness of their proposed approach.

Ultimately, the levels of grooming taking place on private messaging platforms are incredibly serious. I have two important testimonies that are worth placing on the record, both of which have been made anonymous to protect the victims but share the same sentiment. The first is from a girl aged 15. She said:

“I’m in a serious situation that I want to get out of. I’ve been chatting with this guy online who’s like twice my age. This all started on Instagram but lately all our chats have been on WhatsApp. He seemed really nice to begin with, but then he started making me do these things to prove my trust to him, like doing video chats with my chest exposed.”

The second is from a boy aged 17. He said:

“I’ve got a fitness page on Instagram to document my progress but I get a lot of direct messages from weird people. One guy said he’d pay me a lot of money to do a private show for him. He now messages me almost every day asking for more explicit videos and I’m scared that if I don’t do what he says, then he will leak the footage and my life would be ruined”.

Those testimonies go to show how fundamentally important it is for an early assessment to be made of the effectiveness of the Government’s approach following the Bill gaining Royal Assent.

We all have concerns about the use of proactive technology in private messaging and its potential impact on personal privacy. End-to-end encryption offers both risks and benefits to the online environment, but the main concern is based on risk profiles. End-to-end encryption is particularly problematic on social networks because it is embedded in the broader functionality of the service, so all text, DMs, images and live chats could be encrypted. Consequently, its impact on detecting child abuse becomes even greater. There is an even greater risk with Meta threatening to bring in end-to-end encryption for all its services. If platforms cannot demonstrate that they can mitigate those risks to ensure a satisfactory risk profile, they should not be able to proceed with end-to-end encryption until satisfactory measures and mitigations are in place.

Tech companies have made significant efforts to frame this issue in the false binary that any legislation that impacts private messaging will damage end-to-end encryption and will mean that encryption will not work or is broken. That argument is completely false. A variety of novel technologies are emerging that could allow for continued CSAM scanning in encrypted environments while retaining the privacy benefits afforded by end-to-end encryption.

Apple, for example, has developed its NeuralHash technology, which allows for on-device scans for CSAM before a message is sent and encrypted. That client-side implementation—rather than server-side scanning—means that Apple does not learn anything about images that do not match the known CSAM database. Apple’s servers flag accounts that exceed a threshold number of images that match a known database of CSAM image hashes, so that Apple can provide relevant information to the National Center for Missing and Exploited Children. That process is secure and expressly designed to preserve user privacy.

Homomorphic encryption technology can perform image hashing on encrypted data without the need to decrypt the data. No identifying information can be extracted and no details about the encrypted image are revealed, but calculations can be performed on the encrypted data. Experts in hash scanning—including Professor Hany Farid of the University of California, Berkeley, who developed PhotoDNA—insist that scanning in end-to-end encrypted environments without damaging privacy will be possible if companies commit to providing the engineering resources to work on it.
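
As a rough illustration of the hash-matching approach described in the two paragraphs above, the sketch below shows the general idea of checking image hashes against a database of known hashes and flagging an account only once a threshold is exceeded. It is not Apple’s NeuralHash or Microsoft’s PhotoDNA: the hash function, database entries and threshold are hypothetical stand-ins, and a real system would use a perceptual hash rather than a cryptographic one.

```python
# Sketch only: threshold-based matching of image hashes against a known database.
# A production system would use a perceptual hash so near-duplicate images still
# match; SHA-256 here is a placeholder, and the database entry is hypothetical.
import hashlib

KNOWN_HASHES = {"0" * 64}   # hypothetical database of known-image hashes
REPORT_THRESHOLD = 3        # flag an account only after several matches

def image_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(images: list[bytes]) -> int:
    # Nothing is learned about images whose hashes are not in the database.
    return sum(1 for img in images if image_hash(img) in KNOWN_HASHES)

def should_flag(images: list[bytes]) -> bool:
    return count_matches(images) >= REPORT_THRESHOLD
```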

To move beyond the argument that requiring proactive scanning for CSAM means breaking or damaging end-to-end encryption, amendments to the Bill could provide a powerful incentive for companies to invest in technology and engineering resources that will allow them to continue scanning while pressing ahead with end-to-end encryption, so that privacy is preserved but appropriate resources for and responses to online child sexual abuse can continue. It is highly unlikely that some companies will do that unless they have the explicit incentive to do so. Regulation can provide such an incentive, and I urge the Minister to make it possible.

Mrs Maria Miller (Basingstoke) (Con)

It is a pleasure to follow the shadow Minister, who made some important points. I will focus on clause 37 stand part. I pay tribute to the Minister for his incredible work on the Bill, with which he clearly wants to stop harm occurring in the first place. We had a great debate on the matter of victim support. The Bill requires Ofcom to produce a number of codes of practice to help to achieve that important aim.

Clause 37 is clear: it requires codes of practice on illegal content and fraudulent advertising, as well as compliance with “the relevant duties”, and it is on that point that I hope the Minister can help me. Those codes will help Ofcom to take action when platforms do things that they should not, and will, I hope, provide a way for platforms to comply in the first place rather than falling foul of the rules.

How will the codes help platforms that are harbouring material or configuring their services in a way that might be explicitly or inadvertently promoting violence against women and girls? The Minister knows that women are disproportionately the targets of online abuse on social media or other platforms. The impact, which worries me as much as I am sure it worries him, is that women and girls are told to remove themselves from social media as a way to protect themselves against extremely abusive or harassing material. My concern is that the lack of a specific code to tackle those important issues might inadvertently mean that Ofcom and the platforms overlook them.

Would a violence against women and girls code of practice help to ensure that social media platforms were monitored by Ofcom for their work to prevent tech-facilitated violence against women and girls? A number of organisations think that it would, as does the Domestic Abuse Commissioner herself. Those organisations have drafted a violence against women and girls code of practice, which has been developed by an eminent group of specialists—the End Violence Against Women Coalition, Glitch, Carnegie UK Trust, the NSPCC, 5Rights, and Professors Clare McGlynn and Lorna Woods, both of whom gave evidence to us. They believe it should be mandatory for Ofcom to adopt a violence against women and girls code to ensure that this issue is taken seriously and that action is taken to prevent the risks in the first place. Clause 37 talks about codes, but it is not specific on that point, so can the Minister help us? Like the rest of the Committee, he wants to prevent women from experiencing these appalling acts online, and a code of practice could help us deal with that better.

Online Safety Bill (Fourth sitting)

Debate between Alex Davies-Jones and Maria Miller
Committee stage & Committee Debate - 4th sitting
Thursday 26th May 2022

Public Bill Committees
Alex Davies-Jones

Q One final question from me. I would like to discuss your thoughts on transparency and how we can make social media companies like Meta be more transparent and open with their data, beyond the measures we currently have in the Bill. For instance, we could create statute to allow academics or researchers in to examine their data. Do you have any thoughts on how this can be incentivised?

Stephen Almond: Transparency is a key foundation of data protection law in and of itself. As the regulator in this space, I would say that there is a significant emphasis within the data protection regime on ensuring that companies are transparent about the processing of personal data that they undertake. We think that that provides proportionate safeguards in this space. I would not recommend an amendment to the Bill on this point, because I would be keen to avoid duplication or an overlap between the regimes, but it is critical; we want companies to be very clear about how people’s personal data is being processed. It is an area that we are going to continue to scrutinise.

Mrs Maria Miller (Basingstoke) (Con)

May I ask a supplementary to that before I come on to my main question?

--- Later in debate ---
Alex Davies-Jones

Q We have heard a lot from other witnesses about the ability of Ofcom to regulate the smaller high-risk platforms. What is your view on that?

Poppy Wood: Absolutely, and I agree with what was said earlier, particularly by groups such as HOPE not hate and Antisemitism Policy Trust. There are a few ways to do this, I suppose. As we are saying, at the moment the small but high-risk platforms just are not really caught in the current categorisation of platforms. Of course, the categories are not even defined in the Bill; we know there are going to be categories, but we do not know what they will be.

I suppose there are different ways to do this. One is to go back to where this Bill started, which was not to have categories of companies at all but to have a proportionality regime, where depending on your size and your functionality you had to account for your risk profile, and it was not set by Ofcom or the Government. The problem of having very prescriptive categories—category 1, category 2A, category 2B—is, of course, that it becomes a race to the bottom in getting out of these regulations without having to comply with the most onerous ones, which of course are category 1.

There is also a real question about search. I do not know how they have wriggled out of this, but it was one of the biggest surprises in the latest version of the Bill that search had been given its own category without many obligations around adult harm. I think that really should be revisited. All the examples that were given earlier today are absolutely the sort of thing we should be worrying about. If someone can google a tractor in their workplace and end up looking at a dark part of the web, there is a problem with search, and I think we should be thinking about those sorts of things. Apologies for the example, but it is a really, really live one and it is a really good thing to think about how search promotes these kinds of content.

Mrs Miller

Q I want to touch on something we have not talked about a lot today, which is enforcement and the enforcement powers in the Bill. There are significant enforcement powers in the Bill, but do our two witnesses here think that those enforcement powers are enough? Eva?

Eva Hartshorn-Sanders: Are you specifically asking about the takedown notices and the takedown powers?

--- Later in debate ---
Alex Davies-Jones

Q One final question from me, because I know others will want to come in. How do you think platforms such as Meta—I know we have used Meta as an example, but there are others—can be incentivised, beyond the statutory duty that we are currently imposing, to publish their data to allow academics and researchers into their platforms to examine exactly what is going on? Or is this the only way?

Frances Haugen: All industries that live in democratic societies must live within democratic processes, so I do believe that it is absolutely essential that we the public, through our democratic representatives like yourself, have mandatory transparency. The only two other paths I currently see towards getting any transparency out of Meta, because Meta has demonstrated that it does not want to give even the slightest slivers of data—for example, how many moderators there are—are via ESG, so we can threaten them with divestment by saying, “Prosocial companies are transparent with their data,” and via litigation. In the United States, sometimes we can get data out of these companies through the discovery process. If we want consistent and guaranteed access to data, we must put it in the Bill, because those two routes are probabilistic—we cannot ensure that we will get a steady, consistent flow of data, which is what we need to have these systems live within a democratic process.

Mrs Miller

Q Turning to the issue of child safety and online abuse with images involving children, what should be added to or removed from the Bill to improve how it protects children online? Have you got any thoughts on that? Some groups have described the Bill’s content as overly broad. Would you make any comments on how effective it will be in terms of online safety for children?

Frances Haugen: I am not well versed on the exact provisions in the Bill regarding child safety. What I can say is that one of the most important things that we need to have in there is transparency around how the platforms in general keep children under the age of 13 off their systems—transparency on those processes—because we know that Facebook is doing an inadequate job. That is the single biggest lever in terms of child safety.

I have talked to researchers at places like Oxford and they talk about how, with social media, one of the critical windows is when children transition through puberty, because they are more sensitive on issues, they do not have great judgment yet and their lives are changing in really profound ways. Having mandatory transparency on what platforms are doing to keep kids off their platforms, and the ability to push for stronger interventions, is vital, because keeping kids off them until they are at least 13, if not 16, is probably the biggest single thing we can do to move the ball down the field for child safety.

Online Safety Bill (Third sitting)

Debate between Alex Davies-Jones and Maria Miller
Committee stage & Committee Debate - 3rd sitting
Thursday 26th May 2022


Public Bill Committees
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q187 Good morning to our witnesses. Thank you for joining us today. One of the main criticisms of the Bill is that the vast majority of the detail will not be available until after the legislation is enacted, under secondary legislation and so on. Part of the problem is that we are having difficulty in differentiating the “legal but harmful” content. What impact does that have?

William Perrin: At Carnegie, we saw this problem coming some time ago, and we worked in the other place with Lord McNally on a private Member’s Bill—the Online Harms Reduction Regulator (Report) Bill—that, had it carried, would have required Ofcom to make a report on a wide range of risks and harms, to inform and fill in the gaps that you have described.

Maria Miller Portrait Mrs Maria Miller (Basingstoke) (Con)
- Hansard - - - Excerpts

On a point of order, Ms Rees. There is a gentleman taking photographs in the Gallery.

None Portrait The Chair
- Hansard -

There is no photography allowed here.

William Perrin: Unfortunately, that Bill did not pass and the Government did not quite take the hint that it might be good to do some prep work with Ofcom to provide some early analysis to fill in holes in a framework Bill. The Government have also chosen in the framework not to bring forward draft statutory instruments or to give indications of their thinking in a number of key areas of the Bill, particularly priority harms to adults and the two different types of harms to children. That creates uncertainty for companies and for victims, and it makes the Bill rather hard to scrutinise.

I thought it was promising that the Government brought forward a list of priority offences in schedule 7—I think that is where it is; I get these things mixed up, despite spending hours reading the thing. That was helpful to some extent, but the burden is on the Government to reduce complexity by filling in some of the blanks. It may well be better to table an amendment to bring some of these things into new schedules, as we at Carnegie have suggested—a schedule 7A for priority harms to adults, perhaps, and a 7B and 7C for children and so on—and then start to fill in some of the blanks in the regime, particularly to reassure victims.

Online Safety Bill (First sitting)

Debate between Alex Davies-Jones and Maria Miller
None Portrait The Chair
- Hansard -

Maria Miller has indicated that she would like to ask a question, so if I may, I will bring her in.

Maria Miller Portrait Mrs Maria Miller (Basingstoke) (Con)
- Hansard - - - Excerpts

Not immediately—go on, please.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q Thank you, Chair, and thank you, Maria.

I am just trying to get to the intricacies of this, and of what would happen during the time that it would take for you to recategorise. This platform, which is disseminating harm to both children and adults, would be allowed to carry on while the recategorisation process is under way. There is no mechanism in the Bill to stop that from happening.

Richard Wronka: A really important point here is that we will be regulating that platform from the outset for illegal content and, potentially, for how it protects children on its platform, irrespective of the categorisation approach. That is really important. We will be able to take action, and take action quickly, irrespective of how the platform is categorised. Categorisation really determines whether the adult “legal but harmful” provisions apply. That is the bit that really matters in this context.

It is worth reminding ourselves what those provisions mean: they are more a transparency and accountability measure. Those categorised as category 1 platforms will need to have clear terms and conditions applied to adult “legal but harmful” content, and they will need to implement those consistently. We would expect the really serious and egregious concerns to be picked up by the “illegal” part of the regime, and the protection-of-children part of the regime. The categorisation process may go on. It may take a little time, but we will have tools to act in those situations.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - -

Q May I bring you on to the powers of the Secretary of State and the question of the regulator’s independence? The Bill will see the Secretary of State, whoever that may be, have a huge amount of personal direction over Ofcom. Do you have any other experience of being directed by a Secretary of State in this way, and what are the consequences of such an approach?

Kevin Bakhurst: We do have some experience across the various sectors that we regulate, but being directed by the Secretary of State does not happen very often. Specifically on the Bill, our strong feeling is that it is entirely appropriate that the Secretary of State should be able to direct us on matters of national security and terrorist content. However, we have some concerns about the wider direction powers of the Secretary of State, and particularly the grounds on which the Secretary of State can direct public policy, and we have expressed those concerns previously.

We feel it is important that the independence of a regulator can be seen to be there and is there in practice. Legally, we feel it important that there is accountability. We have some experience of being taken to judicial review, and there must be accountability for the codes of practice that we put in place. We must be able to show why and how we have created those codes of practice, so that we can be accountable and there is absolute clarity between regulator and Government.

Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Q Thank you very much to the witnesses who have taken the time to be with us today. We are really grateful. You have already alluded to the fact that you have quite extensive experience in regulation, even in social media spaces. I think the Committee would be really interested in your view, based on your experience, about what is not in the Bill that should be.

Kevin Bakhurst: Richard has been leading this process, so he can give more detail on it, but suffice to say, we have been engaging closely with DCMS over the last year or so, and we appreciate the fact that it has taken on board a number of our concerns. What we felt we needed from the Bill was clarity as far as possible, and a balance between clarity and flexibility for this regime, which is a very fast-moving field. We feel, by and large, that the Bill has achieved that.

We still have concerns about one or two areas, to pick up on your question. We feel it is really important—hopefully this is something the Committee can contribute to—that the definition of “illegal content” is really clear for platforms, and particularly the area of intent of illegality, which at the moment might be quite tricky for the platforms to pick up on.

Richard Wronka: I completely agree with Kevin that the Bill as it stands gives us a good framework. I think the pre-legislative scrutiny process has been really helpful in getting us there, and I point out that it is already quite a broad and complex regime. We welcome the introduction of issues such as fraudulent advertising and the regulation of commercial pornographic providers, but I think there is a point about ensuring that the Bill does not expand too much further, because that might raise some practical and operational issues for us.

I completely agree with Kevin that clarity in the Bill regarding illegal content and what constitutes that is really important. An additional area that requires clarity is around some of the complex definitions in the Bill, such as journalistic content and democratically important content. Those are inherently tricky issues, but any extra clarity that Parliament can provide in those areas would be welcome.

Access to Cash

Debate between Alex Davies-Jones and Maria Miller
Wednesday 20th October 2021


Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

This information is provided by Parallel Parliament and does not comprise part of the official record

Maria Miller Portrait Mrs Maria Miller (in the Chair)
- Hansard - - - Excerpts

I encourage Members to wear masks when they are not speaking, and to give each other space when moving around, or entering or leaving the room.

Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - -

I beg to move,

That this House has considered access to cash.

It is a pleasure to serve under your chairship, Mrs Miller. Given recent events, I feel it is important to take a moment to pay tribute to our wonderful colleague, Sir David Amess, who was a regular contributor to Westminster Hall debates. His presence here will forever be missed.

I am pleased to have secured this debate, particularly as our ability to physically access cash has been restricted as we continue to tackle coronavirus, and given the recent increase to the contactless card spending limit from £45 to £100.

I come to this debate with a specific constituency interest in mind. One of the jewels in the Pontypridd crown is the Royal Mint, based in Llantrisant. It is a major local employer, and I must give its tourist attraction, the Royal Mint Experience, a quick plug. The Royal Mint is the manufacturer of UK coins, and is not directly involved in policy on the use of cash, but it is a key contributor to ensuring that certain skills, and the capability to circulate coins, still exist in this country. I was joined there by the Under-Secretary of State for Wales, the hon. Member for Monmouth (David T. C. Davies), only a few months ago; we struck coins, and met young people on the kickstart scheme. I will, however, try to refrain from reminding the Minister that despite all the country’s coins being made in my constituency, we sadly see precious little money in return from the Government. Perhaps that is a matter to be discussed another time.

Instead, I will focus on the sad, widespread repercussions of reduced cash flow, which is having a major impact on high streets up and down the country. Many have been hit by multiple bank closures, including in my constituency of Pontypridd and across Caerphilly. Banks not only provide vital services for a huge range of community groups, but are often the epicentre of our high streets, and are vital in encouraging local trade and footfall for surrounding businesses.

Innovation in Hospital Design

Debate between Alex Davies-Jones and Maria Miller
Tuesday 4th February 2020


Westminster Hall


Maria Miller Portrait Mrs Maria Miller (Basingstoke) (Con)
- Hansard - - - Excerpts

I beg to move,

That this House has considered innovation in hospital design.

It is a pleasure to serve under your chairmanship, Ms Nokes, and to hold this important debate. Last September, the Government gave the green light to 40 new hospitals, as part of the health infrastructure plan. For my Basingstoke community this means support for a hospital replacing a much-loved building, built over 40 years ago.

We wanted a new hospital six years ago. While it is important to refresh those plans, because we are now talking about a district hospital, not just a critical treatment hospital, we already have a great deal of work in place. The initial community consultation has identified widespread support. The ambulance service has identified the location that would save more lives. The local council has given planning consent for a hospital to be built.

What about the building itself? If we are to realise the full benefits of this once-in-a-generation opportunity for our healthcare infrastructure, we need not the fad of the moment, but the best design for our hospitals based on evidence and the needs of clinicians, patients, staff and the community, as well as research at home and abroad, to create the best blueprint for local trusts to use for the next generation of NHS hospitals.

Guidance on how to design a new hospital, provided by the NHS to hospital trusts, has been called “out of date” by Architects for Health, an organisation dedicated to improving healthcare design. That should concern us. I hope that the Minister will reassure me that any new hospital will benefit from the best design thinking based on the best evidence around the world.

Many of the crucial design factors identified through research by design experts are completely absent in many hospitals within the NHS estate. Many of our hospitals, including our hospital in Basingstoke, were built for a different era of medicine. The buildings have been modified, added to, partially knocked down and rebuilt, and prefabricated units have been built in front of old units. Any sense of coherence in the design of our hospitals has long been lost.

Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - -

I hope the right hon. Lady agrees that, historically, hospitals have been built away from where those services were most needed, causing issues with the recruitment of consultants and doctors, who then have to work with a demographically and geographically diverse population. I hope that location is given full consideration when new hospitals are designed.

Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

The hon. Lady is absolutely right. That is why I was pleased that the NHS trust in Hampshire went to the emergency services and asked where the best location for a hospital would be. They identified junction 7 on the M3—an area not unknown to you, Ms Nokes—as a fantastic location. It would be convenient not only for staff, but for the ambulance service, so that it can save more lives. All these issues must be taken into account.

We have experience of building hospitals since Basingstoke and North Hampshire Hospital was built in the 1970s. Newer private finance initiative hospitals have often been debated in this place. Interesting research has been done on their design, showing that the innovative use of design was inhibited because private finance saw those hospitals foremost as an investment vehicle, and tried to reduce risk by using conventional design and construction methods—looking to the past rather than the future. We cannot make the same mistakes again.

What makes a good hospital for now, or, better still, for 2060, when these hospitals will still be operating? Based on the past 40 years of experience, we know the next generation of hospitals must be flexible in their design, not only to accommodate change, but to be built with change in mind and not as an afterthought. I am sure that some elements can be standardised, but the overall design must be flexible. Some new hospitals are built with the intention that they may have an entirely different use in the future. In our communities there are successful examples of buildings that began with one intended use and have moved to another, but they are few and far between. We need to ensure these hospitals have that flexibility, to scale up, change, scale back and even change use entirely.

Patient treatment is the prime function of a hospital, but so is patient recovery. The prevalence of multimorbidity requires a different way of thinking. Perhaps people with mental and physical illness—indeed, those with both simultaneously—should be treated side by side. Rightly, our focus is on early detection and prevention, so part of any new hospital must be mobile, to take prevention of disease into the heart of our community, with the permanent migration of some services from hospitals to the community, including simple diagnostics and therapies.

Research from the US demonstrates the importance of the right environment for patient recovery, including noise reduction, air quality, green space, daylight and seeing nature. Unsurprisingly, all those elements promote good health in well people, too. In 1984, a study by Roger Ulrich proved that a view through a window of a natural setting—perhaps the Hampshire countryside—would aid recovery. Those who had a view of a natural scene had a shorter stay and fewer complications and required less pain relief than those with a window facing a brick wall. Those are not new ideas. Florence Nightingale insisted on every ward being flooded with sunlight, with windows that opened to bring in light and ventilation lifting the spirits, but that is not the case for every ward in my hospital and hospitals around the country.

Staff retention is one of the most acute issues for the NHS. NHS staff are hugely loyal and dedicated. The hundreds of people in Basingstoke who work in my local hospital go above and beyond every day in bringing the best care to my constituents. However, where we work matters, and we should not rely on that loyalty and dedication but reciprocate it. We need to think about how design can improve everyday working lives.

Office design has evolved over the past 40 years, creating spaces that encourage creative collaboration. However, in hospitals things have not changed much at all, yet collaboration and creativity are just as relevant in medicine as in commerce, as are training and upskilling, which should be designed into these new buildings.

Of course, a hospital’s environmental impact also needs to be minimised. The importance of renewable energy and public transport links goes without saying, but we need to take account of the actual design of the hospital, to ensure that it is a design that the surrounding community can be proud of, and so the hospital does not look as if it has landed from outer space and instead fits with the natural setting; a hospital should be a building that will add to that natural setting and not detract from it.

For this new generation of hospitals to be truly sustainable, there needs to be a move away from the disposable hospital design of the 1970s, which was perhaps used when the hospital in Basingstoke was built back in 1972. A building that is flexible and that can be repurposed is a building that is sustainable, which is the approach that we must take.

Each and every one of the 40 new hospitals will be a huge investment for taxpayers, and it is right that approval procedures are rigorous. However, I hope that my hon. Friend the Minister can assure us that, despite that rigour, the long-term benefits of the best working environment for staff are not traded for a short-term reduction in cost.

Hospitals are absolutely extraordinary places that do extraordinary things on a routine basis. They are places where we experience the most emotional experiences in our lives; they are the places where new life is brought into the world and where we face our darkest moments. I will always remember the birth of my three children in Queen Charlotte’s Hospital in London, even though there was a decision to move the hospital after the birth of my second child and I had to go to a new location for my third child. Nevertheless, to be surrounded by experts in maternity and midwifery was an extraordinary experience, and we always have a debt of gratitude to hospitals that have served us in that way. Now Basingstoke hospital is looking after my mother and my father in an extraordinary way, and we should always recognise the incredible lengths that the NHS goes to, in order to ensure that we have the right support in place at the right time.

That is why communities have such a profoundly emotional attachment to their hospitals. That is a challenge that the Government face as they introduce their plans for 40 new hospitals, because they must recognise the impact of any change to a building with which people have an emotional bond and attachment, whether they have had a baby or visited a dying relative there. We need to understand that and take the community with us.

I hope that my hon. Friend the Minister can outline today how the Government will ensure that this once-in-a-generation opportunity—these 40 new hospitals for communities right across the nation—involves good design. That means design that helps to provide the best treatment, the best recovery, the best staff retention and the best for our environment, and such design should be at the heart of each and every new hospital, because we must build hospitals for the future and not simply replicate the past. We also need to recognise the emotional role that hospitals play in the lives of our families and our communities. We must work with the people the NHS serves to ensure that this groundbreaking development of the NHS estate is understood, embraced and welcomed.