14 Luke Evans debates involving the Department for Digital, Culture, Media & Sport

Tue 19th Apr 2022
Online Safety Bill
Commons Chamber

2nd reading
Mon 30th Nov 2020
Telecommunications (Security) Bill
Commons Chamber

2nd reading & Carry-over motion & Money resolution & Programme motion & Ways and Means resolution

Online Safety Bill

Luke Evans Excerpts
2nd reading
Tuesday 19th April 2022

Commons Chamber
Chris Philp

I must make some progress, because I am almost out of time and there are lots of things to reply to.

I particularly thank previous Ministers, who have done so much fantastic work on the Bill. With us this evening are my hon. Friend the Member for Gosport (Dame Caroline Dinenage) and my right hon. Friends the Members for Maldon (Mr Whittingdale) and for Basingstoke (Mrs Miller), but not with us this evening are my right hon. and learned Friend the Member for Kenilworth and Southam (Jeremy Wright), who I think is in America, and my right hon. Friends the Members for Hertsmere (Oliver Dowden) and for Staffordshire Moorlands (Karen Bradley), all of whom showed fantastic leadership in getting the Bill to where it is today. It is a Bill that will stop illegal content circulating online, protect children from harm and make social media firms be consistent in the way they handle legal but harmful content, instead of being arbitrary and inconsistent, as they are at the moment.

Draft Online Safety Bill Report

Luke Evans Excerpts
Thursday 13th January 2022

Commons Chamber
Dr Luke Evans (Bosworth) (Con)

I come to this report through the prism of my work on body image. The Minister will be pleased to hear that I will not give again the speech that I delivered yesterday, when he was kind enough to join proceedings on my private Member’s Bill about digitally altered body images that should carry a logo. Although I would welcome the Government taking on that Bill, I have to play on the Government’s playing field, which has led me to assess this Bill through that prism.

I should congratulate the Government on what they are trying to achieve: a world-leading, world-beating risk assessment across the internet. To achieve that would be no mean feat. I have not heard mentioned enough the role that Ofcom will play. Having met Ofcom, I know that it would need the tools and ability to investigate and to levy very heavy fines and punishments on companies for breaching the rules. Those powers are going to be the key to holding this all together.

Body image falls on the side of content that is legal but harmful. Clause 46(3) of the draft Bill states:

“Content is within this subsection if the provider of the service has reasonable grounds to believe that the nature of the content is such that there is a material risk of the content having, or indirectly having, a significant adverse physical or psychological impact on an adult of ordinary sensibilities”.

It repeats that in several versions. I am pleased to see that that matches up with the report, but I appreciate that there is a difference of opinion on whether clause 11 should remain. Both pick up on the fact that

“Knowingly false communications likely to cause significant physical or psychological harm to a reasonable person”

should be called out. The report goes on to state:

“As with the other safety duties, we recommend that Ofcom be required to issue a mandatory code of practice to service providers on how they should comply with this duty. In doing so they must identify features and processes that facilitate sharing and spread of material in these named areas and set out clear expectations of mitigation and management strategies”.

After reading those points, both in the Bill and the report, I think a gap has been missed. There is no problem with seeing one doctored image; it is the volume of doctored images—the repeated images of shoulders distorted, waists thinner, breasts bigger—that has an impact. That is the same with people who are looking for information on dietary requirements. My hon. Friend the Member for Gosport (Dame Caroline Dinenage), who is no longer in her place, hit the nail on the head perfectly. It is about algorithms. That is where I want the Bill to be stronger. In every meeting that I have had with TikTok, Instagram, Facebook or Snapchat—you name it—when I have asked about algorithms, they say, "We can't tell you more about it because it's commercially sensitive," but they are fundamentally what is driving us down the rabbit holes that the report rightly picks up on. How will we in this House determine what things look like if we do not understand what is driving them there in the first place? The horse has left the stables by the time we are picking up the pieces.

I am pleased that in previous debates the Minister has said that Ofcom will be able to request this information, but I would ask that we go one step further and say that that information could be exposed to the public. Why? Because that will undermine the whole model driving these companies in their commercial activity, because it will lay it bare for us all to see. That is key to the transparency that we need. Otherwise, how do we police the volume of images that are being delivered to our young people, whether they are body images or about self-harm, race hate or race-baiting, or whatever people want to call it, or whatever their niche happens to be? As we heard in this debate, social media plays on not only people's interests, but their insecurities. That is what we have to tighten up on. The Bill and this report, working in conjunction, can really do that. However, I urge that the volume and, most importantly, the algorithms are considered.

Body Image in the Media and Online

Luke Evans Excerpts
Wednesday 24th November 2021

Westminster Hall

Peter Dowd (in the Chair)

Before we begin, I remind Members that they are expected to wear face coverings when they are not speaking in this debate. This is in line with current Government guidance and that of the House of Commons Commission. I remind Members that they are asked by the House to have a covid lateral flow test twice a week if they are coming on to the parliamentary estate. This can be done either at the testing centre in the House or at home. Please also give each other space when you are seated, and when leaving or entering the Chamber.

I will now call Dr Luke Evans to move the motion and then I will call the Minister to respond. There will not be an opportunity for Dr Evans to sum up at the end, as is the convention for 30-minute debates.

Dr Luke Evans (Bosworth) (Con)

I beg to move,

That this House has considered Government action on body image in the media and online.

It is a pleasure to serve under your chairmanship, Mr Dowd, and I welcome the Minister to his role. This is the first time that I have formally met him to discuss this issue.

I will open with a description of an advert put out about a year ago by Dove, called “Reverse Selfie”. It starts with a young girl looking at her phone. On that phone, there is a picture of her. She may be in her late teens or early twenties. She starts to scroll backwards. She sees the comments underneath the photo suddenly disappearing, with all the “likes” going away and the comment, “You look amazing”, disappearing. Suddenly, the filter changes and so does her hair colour. The size of her face, including her nose, changes, and the blemishes on her skin all suddenly reappear. The process goes further. She puts the phone down, lies backwards and there is a picture of her family, which she has scrubbed off the back of the wall, and a picture of her favourite teen band. Furthermore, the image shows her makeup, including her lipstick, coming off. Finally, what is left in front of us is a girl no older than 13 or 14. The advert then finishes with the line:

“The pressure of social media is hurting our girls’ self-esteem”.

The advert is only a minute long, so if people have a chance I encourage them to look at it, because it encapsulates perfectly the kind of world in which we now exist, and the problem is getting worse.

Over the next few minutes, I will set out three points to address when it comes to debating this issue of body image and what the Government can do. The first is the scale of the problem; the second is why it matters; and the third, and most important, is what we can do about it.

There are so many statistics out there, but I will quickly go through the scale of the problem. Evidence from Girlguiding shows that two in five girls between the ages of 11 and 16 have seen images online making them feel insecure or less confident about themselves, rising to 50% for those aged between 17 and 21. Some 55% of girls aged between 11 and 21 say that these images make them feel insecure by showing unattainably wealthy lifestyles or expensive clothes, and 94% agree that more should be done to protect them from body image pressures online. Some 90% of girls agree that there should be stricter rules for online advertisers. The Women and Equalities Committee heard that more than six in 10 women feel negative about their bodies. Factors including diet culture and being bombarded with images of photoshopped, sexualised women have negative impacts.

It is not just women; it is men as well. Some 35% of men aged between 16 and 40 say that they are unhappy with how they look; 48% say that they have struggled with their mental health because of unhappiness about their appearance; and two in every five men feel pressured to have the perfect body.

It goes further. Work by the Mental Health Foundation found that 85% of under-18s thought that appearance was either very important or important, but it led to one in five adults and one in three teenagers feeling actual shame about the way they looked. The Women and Equalities Committee report said that the triggers include social media, stereotypes and, of course, conventional media. Those are just some of the survey results that give a flavour of what people in this country feel like.

Why does this actually matter? Let us take the worst-case scenario. As a clinician, I have seen more and more men—but also women—with concerns about body image. At worst, they suffer with eating disorders. There are 1.25 million people suffering with anorexia or bulimia. There are also 1 million people, particularly men, who are using steroids to bulk up, to try to get those stereotypical big shoulders or tight abs. That was brought home to me when UK Anti-Doping saw my campaign about body image and came to me with evidence of how much of a problem it is causing. It is finding that people who are using drugs for aesthetic enhancement are then turning up to play rugby only to then be banned from the sport.

On eating disorders, there has been a 50% increase in the number of people accessing services since 2016-17. We are seeing the worst extremes, but this is the thin end of the wedge. Combine those factors with what we have just talked about—the way the nation is feeling—and we see that there is an obvious cause for concern.

This matters and I do not believe that social media companies are doing enough. Many have filters, educational content, and ways of trying to filter out some of the problems they face, but, fundamentally, we need to go further, because the problem is getting worse and it is young people who are bearing the brunt of it. Other countries have started to make strides in addressing the problem, most notably Israel and France, and Norway has recently said that it will look at labelling digitally altered images. There is a precedent, therefore, not only here but across the western world, to make a difference.

What can we do? I am completely aware that the Minister is from the Department for Digital, Culture, Media and Sport, and that this problem cannot be solved with one single hit, because there is a chain. It involves parental responsibility and educating our children to be aware of the content they are looking at. Of course, when people go on to a platform, they need to have the tools to protect themselves, the platform needs to take responsibility and show them due care, and if things go wrong, a regulator, backed up by legal statute, needs to be able to deal with it.

For the purpose of this debate, I will concentrate on three solutions that I think could make a difference. I proposed the first one last year—namely, labelling digitally altered images. It is a very simple process and we already have a precedent for it, with “PP” appearing for paid product placement advertising on TV. This is in line with the health aspect of providing information about calories and content on food labelling. It provides parity for mental health by saying that the image is not quite as it seems. We already have a precedent in advertising, as well, with video game adverts stating, “This is not actual video footage”, and, of course, political advertising is labelled as such.

I often use this example when people ask what that means: if someone wanted to sell their house or rent a room, it would be absolutely justifiable to paint the walls, put out a new throw and change the lighting. However, what they fundamentally could not do is digitally alter the size of the garden, the roof or the living space. That is what I am asking the Government to look at. We are creating a warped sense of reality that drives young people to believe they can be something that they can never achieve. I am all for aspiration and people improving their aesthetics—I am a GP by trade and I welcome the healthy promotion of sport and exercise and people taking care of their bodies—but not if that is a goal that they can never achieve.

Critics of my position often say, “Hang on a second. Isn’t that the nanny state?”, but I would say it is not, because a perfect market needs perfect information. Others ask how it would work practically. The online space already distinguishes between organic and commercial activity according to the number of followers and what accounts use in their content, holding them to a different category of rules. I am not asking for a bride with a blemish to suddenly be punished or banned from dealing with that. I am simply asking for digitally altered images where biceps are made bigger and promoted online—or indeed in magazines—to carry a label.

Of course, that is my wish as a Back Bencher, but we have the draft Online Safety Bill, and I credit the Government for grabbing the bull by the horns and including a world-leading attempt to try to deal with some of the perils of the internet. That is really important, but there are some difficulties. How do we decide what goes in? How do we build a framework? Where does responsibility lie? I am pleased that there is a framework that covers social media companies with a duty of responsibility.

What I am talking about is not illegal, and that means that interpreting what is detrimental is mired in difficulty. Although an individual picture might not be detrimental, we start to have a problem when we are bombarded with 100 pictures of people with abs and shoulders the size of a fridge. However, I see a solution. Clause 46(3) of the draft Online Safety Bill, which sets out the meaning of content that is harmful to adults, states:

“Content is within this subsection if the provider of the service has reasonable grounds to believe that the nature of the content is such that there is a material risk of the content having, or indirectly having, a significant adverse…psychological impact on an adult of ordinary sensibilities”,

with roughly the same wording in place for children as well. I put it to the Government that body image could well be a legal but harmful issue, and should be counted among the priority harms.

As I have said, the draft Online Safety Bill tries to cover a whole load of issues that are not related to body image. My final point and plea to the Government is about the thrust of the issue—what it all boils down to—which is the algorithm driving the content. I am interested in fitness, and I follow CrossFit on Instagram. If I log into my account, it sends me to hundreds of pictures of gents with their tops off, training harder than ever before. It is not an issue having the one image, but there is a real difficulty when hundreds of images are being driven to people. When I raise the issue with all the big social media companies and ask them how the algorithm works, the first thing I hear is, “That is commercially sensitive,” and therein lies the problem. If we do not know what the algorithm is driving to people, and if we do not understand it or have any clarity on it, how can we address the problem in the first place?

I am so pleased to see that Ofcom is in line to deal with the problem. I have met with it to see that it has not only the resources and the legal backing but the ability to punish companies, demand that they open up their algorithms and demand papers from them so that we can get to the bottom of this problem. I would be grateful if the Minister could confirm that that is indeed the Government’s intent, and whether or not algorithms will be included in the online harms Bill. While I have come to this from body image, it would help to deal with all sorts of other issues, be that fraud scams, self-harm or suicide.

I hope that in the past 10 minutes I have demonstrated the scale of the problem, why it matters—because the most vulnerable young people are the ones facing it—and some of the solutions for dealing with it. I look forward to the online safety Bill coming forward. I am aware of the Advertising Standards Authority’s call for evidence about body image and the Government may know that I have launched a petition called #recognisebodyimage to make sure that body image is recognised in UK law for the first time. I hope that might just make a slight bit of difference for the young girl or boy who enjoys spending their time on social media.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

It is a great pleasure to serve under your chairmanship, Mr Dowd. We used to appear opposite each other on occasion, so it is nice to serve under your chairmanship now. I start by congratulating my hon. Friend the Member for Bosworth (Dr Evans) on tabling this important topic for discussion this afternoon and for speaking with such eloquence. The examples he gave were powerful and make a strong case for the need to do more in this important area for the sake of all our children and, indeed, many adults who suffer problems and issues as a result of images they see online. I take the opportunity to assure the House and, indeed, the public that the Government take those problems seriously.

There are two projects under way designed to address exactly those issues, which provide useful platforms for doing more. My hon. Friend the Member for Bosworth touched on both. The first is the online advertising programme, which, as the name implies, is designed to address the content of paid-for online advertising, where some of the images he describes appear. As he said, the Advertising Standards Authority launched a call for evidence on 21 October that remains open until 13 January, so there are opportunities for people to make their views known. I hope that the ASA will be able to do more in this area in response to that.

In the coming months, the online advertising programme consultation will be launched and, again, that will be an extremely useful vehicle into which points such as the ones made today can be fed. That will likely lead in due course to further measures in the online advertising space. It is clear that there is a real opportunity through the programme to do more in this area. Given the call for evidence and the consultation in the coming months, the issues raised by my hon. Friend the Member for Bosworth are extremely timely and very welcome. He has picked his moment with a great deal of good fortune.

There is not just the question of advertising but that of user-generated content, and that is in the scope of the draft Online Safety Bill, which my hon. Friend mentioned. It was published last May and I can see he has a copy of it in front of him, which is diligently tagged up. I am delighted he has been studying it so carefully.

As hon. Members will know, the draft Bill is currently going through a pre-legislative scrutiny process. A Joint Committee of both the House of Commons and the House of Lords, chaired by my hon. Friend the Member for Folkestone and Hythe (Damian Collins), is looking very carefully at it. The Committee has taken extensive evidence and will be publishing a report on or before 10 December, which may well address some of the issues. The Government are certainly in listening mode on the draft Online Safety Bill and we are ready to make changes, amendments and improvements to the Bill where there is a case to do so. There is scope for us to do more in this area. The Bill has a number of important mechanisms that will directly help address some of the issues that have been raised.

Let me pick up a couple of the points raised by my hon. Friend the Member for Bosworth. First, he mentioned the importance of algorithms. As he said, this applies not only to matters of body image and the fact that he has lots of pictures of well-built men appearing in his timeline, for the reasons that he explained, but elsewhere. These algorithms drive all kinds of content, some of which is harmful. In fact, Frances Haugen, the Facebook whistleblower, explained how the algorithms promote content that is often harmful, or even hateful, to individuals for purely commercial reasons. The algorithms do that not through any exercise of editorial judgment, but simply to drive user engagement, and therefore revenue, for the companies concerned. It is a purely commercial, profit-driven activity.

My hon. Friend made a point about transparency. When they are asked to talk a bit more about what these algorithms do, the companies very often refuse to disclose what is going on. Therefore, some of the most important measures in the draft Online Safety Bill are to do with transparency. There is a transparency duty on the category 1 companies—the largest companies—to be transparent about what is going on.

There are also powerful information rights for Ofcom, whereby Ofcom can require the companies concerned to provide information about a whole range of things, including algorithms. Companies will have to provide that information to Ofcom, providing the transparency that is so woefully lacking. If they fail to meet either the transparency duty or the information duty, that is, responding to an information request, they can be fined up to 10% of their global revenue. In the case of the information disclosure duties, not only can the company be punished by way of an extremely large fine, but there will also be personal criminal liability for named executives. There will be a big change in the transparency about algorithms and how information is provided.

In the context of the draft Online Safety Bill, my hon. Friend also mentioned content that is legal but harmful. There is clearly a strong case to say that material that causes either young people or adults to develop anxiety about their body image can potentially be harmful. Once we have passed the Bill, the Ofcom consultation process will define the priority harms, which will be the harms where category 1 companies will have to take particular care. They will have to lay out in their terms and conditions how they will address issues with priority harms. There is a mechanism through which representations can be made, and the argument can be made that matters concerning body image ought to be included.

Dr Luke Evans

I am very grateful for the comprehensive answers that the Minister is giving. On that secondary point, will the consultation be coming back to the House of Commons to determine those priorities or will they be set out after a consultation that will be delivered straight to Ofcom for it to make its judgment?

Chris Philp

There will be an extensive consultation run by Ofcom, both on the matters considered to be priority harms and on the codes of practice that go alongside those. The Bill, as drafted, will see those codes of practice and the list of harms come back to the Secretary of State, and there will then be a parliamentary procedure, so Parliament will have an opportunity to look at the list of priority harms and the codes of conduct to be sure that Parliament is happy with them. There are various debates about whether the mechanisms to do that can be fine-tuned in some way, but it will not just disappear into a void with no further interaction with Parliament at all. In providing evidence to Ofcom, there will be an opportunity for my hon. Friend and for people who are campaigning with such passion on this issue to make representations.

Chris Philp

I thank my hon. Friend for his intervention, given that it is built on years of expertise in the industry. These issues require careful thought and there are balances to strike. We do not want to cause unreasonable problems for the advertising industry.

That is why the Government and various regulatory authorities are looking at this in such a careful way, with the call for evidence that is running at the moment, the consultations in the coming months on the online advertising programme and the consultation on the priority harms and codes of conduct that Ofcom will conduct in relation to the online safety Bill. Through those consultations, there will be an opportunity for campaigners to put forward their point of view on body image. Obviously, the advertising industry will have extensive opportunities to put its case. There will be opportunities for regulators and Parliament to think about how that balance can most appropriately be struck. We fully recognise that, as in so many areas, there is a balance to strike in ensuring we reach the right solution.

Dr Luke Evans

I absolutely agree on striking that balance. To address the earlier intervention, I hope that no one would ever see a label on these images, because companies would be socially responsible and choose not to doctor them. However, should those images be doctored for any reason, having that label—a small “p”, a small “b”, or whatever it happens to be—alerts the user to the fact that, when they are scrolling through hundreds of images, particularly on social media, all is not as it seems. I think that is a fair balance.

Chris Philp

I thank my hon. Friend for his intervention. His comment is probably directed as much at my hon. Friend the Member for Woking (Mr Lord) as at me. Clearly, there are important points to debate.

In conclusion, the Government take the issue extremely seriously, not just in the Department for Digital, Culture, Media and Sport but across Government, such as in the Department of Health and Social Care and other Departments. We recognise that serious psychological harm is potentially being caused, particularly to young people but more widely as well. We want to ensure that reasonable steps are taken to avoid harm being inflicted.

I hope Members across the House, with opinions on both sides of the argument, will fully engage with the consultation on the online advertising programme and the call for evidence from the Advertising Standards Authority. I hope they will also fully engage, after the Bill passes, with Ofcom when it consults on the priority harms and codes of conduct. Some extremely important issues and arguments have surfaced on both sides in today’s debate. We look forward to debating the matter further in the coming months to ensure we strike that balance. We need to protect people who need protection, so that the internet is not an ungoverned, lawless space where anything goes, but equally we need to ensure that industries, such as advertising, are not unduly penalised or circumscribed. I am confident that the House, on a cross-party basis, can apply its collective wisdom and strike that balance. I look forward to working with colleagues to achieve that.

Question put and agreed to.

Telecommunications (Security) Bill

Luke Evans Excerpts
2nd reading & Carry-over motion & Money resolution & Programme motion & Ways and Means resolution
Monday 30th November 2020

Commons Chamber
Oliver Dowden

I will make some progress. I may come back to the right hon. Gentleman later, but I have already given way to him twice.

I know that some Members are concerned that we have not named Huawei on the face of the Bill and that our approach could be reversed in years to come. I want to reassure those Members on a number of fronts. We have not chosen to name Huawei for two compelling practical reasons. First, as we discussed, this Bill is designed to tackle not only the Huaweis of today but the Huaweis of tomorrow, wherever they come from. It needs to be flexible enough to cover future threats and not tie our hands by limiting our response to one company and one company alone. Secondly—this is the most crucial point—making reference to any one company would create a hybrid Bill, dramatically slowing the passage of the Bill and therefore our ability to combat all high-risk vendors, including Huawei.

However, as a concrete sign of our commitment to tackling the national security risks posed by Huawei, I can confirm today that we are going further in two significant ways. First—I hope Members will have had a chance to see this—we have published an illustrative designation notice and an illustrative designated vendor direction to demonstrate how the Bill’s powers in relation to a high-risk vendor could be exercised. Given the level of concern in this House and in the other place about Huawei’s role in 5G infrastructure, these illustrative drafts name Huawei explicitly, clarifying our position beyond doubt, and set out a clear pathway to the reduction and removal of its equipment.

Dr Luke Evans (Bosworth) (Con)

Does the Secretary of State believe that taking out companies such as Huawei may have a damaging economic impact, and what assessment has he made of ensuring that we are at the forefront of growing the 5G network in the UK?