All 3 Debates between Kirsty Blackman and Sarah Owen

Tue 12th Dec 2023
Media Bill (Sixth sitting)
Public Bill Committees

Committee stage: 6th sitting
Tue 13th Dec 2022
Online Safety Bill (First sitting)
Public Bill Committees

Committee stage (re-committed clauses and schedules): 1st sitting
Tue 13th Dec 2022
Online Safety Bill (Second sitting)
Public Bill Committees

Committee stage (re-committed clauses and schedules): 2nd sitting

Media Bill (Sixth sitting)

Debate between Kirsty Blackman and Sarah Owen
Kirsty Blackman

Specifically on this issue, I agree with the points made by the shadow Minister. I think that asking for a report into this issue is the most sensible way forward, rather than saying that we have got all the answers. Looking at this issue in the whole would be very important.

When my children were younger, we relied a lot on CBeebies; the kids spent a lot of time watching CBeebies rather than anything else. Now that they are a bit bigger, they have forayed into the world of YouTube. When we are considering content on these platforms, at least with CBeebies parents know for certain that there will be no swearing and nothing inappropriate on that channel. Not everything on it is necessarily educational, but it is all funny or good, whereas on YouTube there is an absolute load of nonsense at times, and there are a number of shows on Netflix or Disney+ about which I have had to say to my daughter, “No, you can’t watch that. It’s just nonsense.”

There is value in ensuring that children have access, and easy access, to appropriate content and in encouraging parents to ensure that their children are—well, having gone through the Online Safety Bill, I know that we need to ensure that parents are aware of what their children are consuming on the internet and aware of what they are watching, and that they are taking decisions to manage that content and to ensure that children have good access to it. If the public service broadcasters’ shows for children are more easily accessible, parents will have fewer issues in ensuring that those are the shows that their children see.

Lastly, I will give a wee plug for “Newsround”, which a significant number of schools show in school. It is incredibly important and a really key way in which children are able to access news content in an age-appropriate way that explains the background and the information that they are being provided with. Therefore, I agree entirely with the shadow Minister that it would be sensible to have a report on this issue, and that a watching brief definitely needs to be kept on it.

Sarah Owen (Luton North) (Lab)

Just to add to those points and those made by the shadow Minister, I have often relied on the third parent that is CBeebies, as I imagine many other Members and many of our constituents have as well. I want to talk about the quality of such television and about its educational impact on children, ranging from young children to teenagers.

As has been alluded to, the quality of the BBC’s programmes, particularly on CBeebies, is just a trusted fact. I know as a parent that I could quite happily leave my three-year-old in front of CBeebies. She does not love Peter Rabbit, but I know that it is a safe and secure watch for her. I know that there will be no inappropriate advertising or any inappropriate life lessons or swearing, which I cannot guarantee on other services or channels. There are brilliant CBeebies programmes and characters, such as Mr Tumble, “Bluey”, “Newsround”, which has already been mentioned, and “Dog Squad”, which is a new firm favourite.

As the shadow Minister said, most children now know their way around an iPad, a tablet, a computer or a phone like the back of their hand, and they access all this content in a way that we could not when we were younger, including through Netflix or YouTube. That is a particular concern, because the adverts on YouTube and other online streaming platforms are not always age appropriate. Particularly during the cost of living crisis and in the run-up to Christmas, that is another burden for parents to deal with. It is a huge annoyance that movies and TV shows rely on advertising, and sometimes product placement, which is not always healthy for children.

On the educational impact, I have concerns about how young children watch these programmes. There will need to be access to repeated viewings for the educational impact to be fully felt when it comes to things such as GCSE “Bitesize” or learning letters. One episode of “Yakka Dee!” or “Sesame Street” will not teach my child the entire alphabet. With that in mind, it is important that we have a review of the impact on young people to protect the quality and standards of children’s television.

Online Safety Bill (First sitting)

Debate between Kirsty Blackman and Sarah Owen
Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 13 December 2022
Kirsty Blackman (Aberdeen North) (SNP)

I beg to move amendment 98, in clause 11, page 10, line 17, at end insert

“, and—

(c) mitigate the harm to children caused by habit-forming features of the service by consideration and analysis of how processes (including algorithmic serving of content, the display of other users’ approval of posts and notifications) contribute to development of habit-forming behaviour.”

This amendment requires services to take or use proportionate measures to mitigate the harm to children caused by habit-forming features of a service.

Thank you, Sir Roger, for chairing this recommitted Bill Committee. I will not say that it is nice to be back discussing the Bill again; we had all hoped to have made more progress by now. If you will indulge me for a second, I would like to thank the Clerks, who have been massively helpful in ensuring that this quick turnaround could happen and that we could table the amendments in a sensible form.

Amendment 98 arose from comments and evidence from the Royal College of Psychiatrists highlighting that a number of platforms, and particularly social media platforms such as TikTok and Facebook, generally encourage habit-forming behaviour or have algorithms that encourage it. Such companies are there to make money—that is what companies do—so they want people to linger on their sites and to spend as much time there as possible.

I do not know how many hon. Members have spent time on TikTok, but if they do, and they enjoy some of the cat videos, for instance, the algorithm will know and will show them more videos of cats. They will sit there and think, “Gosh, where did the last half-hour go? I have been watching any number of 20-second videos about cats, because they constantly come up.” Social media sites work by encouraging people to linger on the site and to spend the time dawdling and looking at the advertisements, which make the company additional revenue.

That is good for capitalism and for the company’s ability to make money but the issue, particularly in relation to clause 11, is how that affects children. Children may not have the necessary filters; they may not have the ability that we have to put our phones down—not that we always manage to do so. That ability and decision-making process may not be as refined in children as in adults. Children can be sucked into the platforms by watching videos of cats or of something far more harmful.

Sarah Owen (Luton North) (Lab)

The hon. Member makes an excellent point about TikTok, but it also applies to YouTube. The platforms’ addictive nature has to do with the content. A platform does not just show a person a video of a cat, because that will not keep them hooked for half an hour. It has to show them a cat doing something extraordinary, and then a cat doing something even more extraordinary. That is why vulnerable people, especially children, get sucked into a dark hole. They click to see not just the same video but something more exciting, and then something even more exciting. That is the addictive nature of this.

Kirsty Blackman

That is absolutely the case. We are talking about cats because I chose them to illustrate the situation, but people may look at content about healthy eating, and that moves on to content that encourages them to be sick. The way the algorithms step it up is insidious; they get more and more extreme, so that the linger time is increased and people do not get bored. It is important that platforms look specifically at their habit-forming features.

--- Later in debate ---
Kirsty Blackman

That is exactly why users should be able to block private messaging in general. Someone on Twitter can say, “I’m not going to receive a direct message from anybody I don’t follow.” Twitter users have the opportunity to do that, but there is not necessarily that opportunity on all platforms. We are asking for those things to be included, so that the provider can say, “You’re using private messaging inappropriately. Therefore, we are blocking all your access to private messaging,” or, “You are being harmed as a result of accessing private messaging. Therefore, we are blocking your access to any private messaging. You can still see pictures on Instagram, but you can no longer receive any private messages, because we are blocking your access to that part of the site.” That is very different from blocking a user’s access to certain kinds of content, for example. I agree that that should happen, but it is about the functionalities and stopping access to some of them.

We are not asking Ofcom to mandate that platforms take this measure; they could still take the slightly more nuclear option of banning somebody entirely from their service. However, if this option is included, we could say, “Your service is doing pretty well, but we know there is an issue with private messaging. Could you please take action to ensure that those people who are using private messaging to harm children no longer have access to private messaging and are no longer able to use the part of the service that enables them to do these things?” Somebody might be doing a great job of making games in Roblox, but they may be saying inappropriate things. It may be proportionate to block that person entirely, but it may be more proportionate to block their access to voice chat, so that they can no longer say those things, or direct message or contact anybody. It is about proportionality and recognising that the service is not necessarily inherently harmful but that specific parts of it could be.

Sarah Owen

The hon. Member is making fantastic, salient points. The damage with private messaging is around phishing, as well as seeing a really harmful message and not being able to unsee it. Would she agree that it is about protecting the victim, not putting the onus on the victim to disengage from such conversations?

Kirsty Blackman

I completely agree. The hon. Member put that much better than I could. I was trying to formulate that point in my head, but had not quite got there, so I appreciate her intervention. She is right: we should not put the onus on a victim to deal with a situation. Once they have seen a message from someone, they can absolutely block that person, but that person could create another account and send them messages again. People should be able to choose, and to say, “No, I don’t want anyone to be able to send me private messages,” or “I don’t want any private messages from anyone I don’t know.” We could put in those safeguards.

I am talking about adding another layer to the clause, so that companies would not necessarily have to demonstrate that it was proportionate to ban a person from using their service, as that may be too high a bar—a concern I will come to later. They could, however, demonstrate that it was proportionate to ban a person from using private messaging services, or from accessing livestreaming features. There has been a massive increase in self-generated child sexual abuse images, and a huge amount has come from livestreaming. There are massive risks with livestreaming features on services.

Livestreaming is not always bad. Someone could livestream themselves showing how to make pancakes. There is no issue with that—that is grand—but livestreaming is being used by bad actors to manipulate children into sharing videos of themselves, and once they are on the internet, they are there forever. It cannot be undone. If we were able to ban vulnerable users—my preferred option would be all children—from accessing livestreaming services, they would be much safer.

Online Safety Bill (Second sitting)

Debate between Kirsty Blackman and Sarah Owen
Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 13 December 2022
Sarah Owen

I want to add to the brilliant points made by my hon. Friend the shadow Minister, in particular on the continually changing nature of market forces, which the Minister himself referenced. We want innovation. We want the tech companies to innovate—preferably ones in the UK—but we do not want to be playing catch-up as we are now, making legislation retrospectively to right wrongs that have taken place because our legislative process has been too slow to deal with the technological changes and the changes in social media, in apps, and with how we access data and communicate with one another online. The bare minimum is a biannual report.

Within six months, if a new piece of technology comes up, it does not simply stay with one app or platform; that technology will be leapfrogged by others. Such technological advances can take place at a very rapid pace. The transparency aspect is important, because people should have a right to know what they are using and whether it is safe. We as policy makers should have a right to know clearly whether the legislation that we have introduced, or the legislation that we want to amend or update, is effective.

If we look at any other approach that we take to protect the health and safety of the people in our country—the people we all represent in our constituencies —we always say that prevention is better than cure. At the moment, without transparency and without researchers being able to update the information we need to see, we will constantly be playing catch-up with digital tech.

Kirsty Blackman

This may be the only place in the Bill where I do not necessarily agree wholeheartedly with the Labour Front Benchers. I agree with the vast majority of what was said, but I have some concerns about making mandatory the requirement for transparency reports to be public in all circumstances, because there are circumstances in which that would simply highlight loopholes, allowing people to exploit them in a way that we do not want them to do.

Specifically on the regularity of reporting and some level of transparency, given that the Minister is keen on the commercial imperative and ensuring that people are safe, we need a higher level of transparency than we currently see among the platforms. There is a very good case to be made for some of the transparency reporting to be made public, particularly for the very largest platforms to be required to make it public, or to make sections of it public.

I want to talk about the speed of change to the terms of service and about proportionality. If Ofcom could request transparency reporting only annually, imagine that it received transparency information three days before Elon Musk took over Twitter. Twitter would be a completely different place three days later, and Ofcom would be unable to ask for more transparency information for a whole year, by which point a significant amount of damage could have been done. We have seen that the terms of service can change quickly. Ofcom would not have the flexibility to ask for an updated transparency report, even if drastic changes were made to the services.

Another thing slightly concerns me about doing this annually and not allowing a bit more flexibility. Let us say that a small platform that none of us has ever heard of, such as Mastodon, shoots to prominence overnight. Let us also say that, as a small platform, Mastodon was previously regulated, and Ofcom had made a request for transparency information shortly before Elon Musk took over Twitter and people had migrated to Mastodon. Mastodon would now be suffering from very different issues than those it had when it had a small number of users, compared with the significant number that it has now. It would have changed dramatically, yet Ofcom would not have the flexibility to seek that information. We know that platforms in the online world have sudden stellar increases in popularity overnight. Some have been bubbling along for ages with nobody using them. Not all of them are brand-new platforms that suddenly shoot to prominence. The lack of flexibility is a problem.

Lastly, I agree about researchers being able to access the transparency information provided. It is really important that we recognise that Ofcom is not the only expert. Ofcom has a huge amount of expertise, and it is massively increasing its staff numbers to cope with these issues, but the reality is that those staff are not academic researchers. They do not have the capacity to look at every issue, and they are not necessarily the most prominent experts in the field of child protection, for example. That is not to take away from the expertise in Ofcom, but we could allow it to ask a regulated group of researchers to look at the information and point out any issues that may not have been spotted, particularly given the volume of transparency reports that there are likely to be.