Broadband: Price

Thursday 2nd February 2023
Lords Chamber

Lord Parkinson of Whitley Bay (Con)

Yes, it is very striking. Many people could be saving money and are not aware of it. That is why it is important that contracts are clear, but it also highlights the importance of consumer advice groups and, indeed, debates such as this, to draw the attention of people to the contracts they have signed.

Lord Browne of Ladyton (Lab)

My Lords, of course everybody should read the contracts they sign, but has the Minister read his broadband provider’s contract? These contracts are impossible to understand. They have subcontracts and other regulations—there is no possibility that people will understand the contracts that they have to sign if they want broadband. What my noble friend describes is anti-competitive, inflationary and likely to drive down digital inclusion. This is a matter for the Competition and Markets Authority. The Minister should think about referring this to the Competition and Markets Authority for profiteering and setting up a cartel.

Lord Parkinson of Whitley Bay (Con)

At the risk of sounding like a geek, I have read my contract. I did so because some operators permit their customers to exit their contracts penalty-free when there is a price rise. Mine did; I looked at it, I shopped around and I saved some money. People would be well advised to do the same, but it is important that the industry tells people about the decisions it makes. That is why the Secretary of State brought chief executives in and asked them to consider carefully the impact of the decisions they make and how they communicate them.

Clearview AI Inc

Tuesday 5th July 2022
Lords Chamber

Lord Parkinson of Whitley Bay (Con)

We are looking at Mr Ryder’s report and recommendations. We have yet to assess them as they came out only recently but we think that the current framework offers strong protections.

Lord Browne of Ladyton (Lab)

My Lords, the scale of Clearview’s ambition is global: to have 100 billion face images on its database by next year. That is 14 images for every person presently on this planet. It also gave evidence recently of what it intends to do with this. It gave the Ukrainian Ministry of Defence free access to its software. I am not sure whether the Minister knows this, but the Ministry of Digital Transformation in Ukraine has said that it is using the technology to give Russians the chance to experience the true cost of the war by searching the web for images of dead Russians and contacting their families to say, “If you want to find your loved ones’ bodies, you’re welcome to come to Ukraine”. I can imagine what our attitude would be if that was happening in reverse. Are the Government aware that this company has ambitions well beyond what is within the jurisdiction of the ICO? It can be regulated only by Governments, and our Government should be at the forefront.

Lord Parkinson of Whitley Bay (Con)

I have seen the reports to which the noble Lord refers. As I said, our Information Commissioner’s Office has taken action, and so have its French, Italian, German, Canadian and Australian counterparts. I hope that that sends a clear message to companies such as Clearview that failure to comply with basic data protection principles will not be tolerated in the UK or, indeed, anywhere else. All organisations that process personal data must do so in a lawful, transparent and fair way.

AI in the UK (Liaison Committee Report)

Wednesday 25th May 2022
Grand Committee

Lord Browne of Ladyton (Lab)

My Lords, it is a significant pleasure to follow the noble Lord, Lord Holmes. I admire and envy his knowledge of the issue, but mostly I admire and envy his ability to communicate about these complex issues in a way that is accessible and, on occasion, entertaining. A couple of times during the course of what he said, I thought, “I wish I’d said that”, knowing full well that at some time in the future I will, which is the highest compliment I can pay him.

As was specifically spelled out in the remit of the Select Committee on Artificial Intelligence, the issues that we are debating today have significant economic, security, ethical and social implications. Thanks to the work of that committee and, to a large degree, the expertise and leadership of the noble Lord, Lord Clement-Jones, the committee’s report is evidence that it fully met the challenge of the remit. Since its publication in April 2018—and I know this from the many volunteered opinions I have received since then—the report has gained a worldwide reputation for excellence. It is proper, therefore, that this report should be the first to which the new procedure put in place by the Liaison Committee, to follow up on committees’ recommendations, is applied.

I wish to address the issue of policy on autonomous weapons systems in my remarks. I think that it is known throughout your Lordships’ House that I have prejudices about this issue—but I think that they are informed prejudices, so I share them at any opportunity that I get. The original report, as the noble Lord, Lord Clement-Jones, said, referred to lethal autonomous weapons and particularly to the challenge of the definition, which continues. But that was about as far as the committee went. As I recollect, this weaponry was not the issue that gave the committee the most concern; in any case, it did not have the capacity to address it, saying that it deserved an inquiry of its own. Unfortunately, that has not yet taken place, but it may do soon.

The report that we are debating—which, in paragraph 83, comments on the welcome establishment of the Autonomy Development Centre, announced by the Prime Minister on 19 November 2020 and described as a new centre dedicated to AI, to accelerate the research, development, testing, integration and deployment of world-leading artificial intelligence and autonomous systems—highlighted that the work of that centre will be “inhibited” owing to the lack of alignment of the UK’s definition of autonomous weapons with the definitions used by international partners. The government response, while agreeing on the importance of ensuring that official definitions do not undermine our arguments or diverge from those of our allies, acknowledged that the various definitions relating to autonomous systems are challenging and set out, at length, a comparison of them.

Further, we are told that the Ministry of Defence is preparing to publish a new defence AI strategy that will allow the UK to participate in international debates and act as a leader in the space, and that the definitions will be continually reviewed as part of that work. It is hard not to conclude that this response alone justifies the warning of the danger of “complacency” deployed in the title of the report.

On the AI strategy, on 18 May the ministerial response to my contribution to the Queen’s Speech debate was, in its entirety, an assurance that the AI strategy would be published before the Summer Recess. We will wait and see. I look forward to that, but there is today an urgent need for strategic leadership by the Government and for scrutiny by Parliament as AI plays an increasing role in the changing landscape of war. Rapid advancements in technology have put us on the brink of a new generation of warfare where AI plays an instrumental role in the critical functions of weapons systems.

In the Ukraine war, in April, a senior Defense Department official said that the Pentagon is quietly using AI and machine-learning tools to analyse vast amounts of data, generate useful battlefield intelligence and learn about Russian tactics and strategy. Just how much the US is passing to Ukraine is a matter for conjecture, which I will not engage in; I am not qualified to do so anyway. A powerful Russian drone with AI capabilities has been spotted in Ukraine. Meanwhile, Ukraine has itself employed controversial facial recognition technology. Vice Prime Minister Fedorov told Reuters that it had been using Clearview AI—software that uses facial recognition—to discover the social media profiles of deceased Russian soldiers, which the authorities then use to notify their relatives and offer arrangements for their bodies to be recovered. If the technology can be used to identify live as well as dead enemy soldiers, it could also be incorporated into systems that use automated decision-making to direct lethal force. That is not a remote possibility: last year the UN reported that an autonomous drone had killed people in Libya in 2020. There are unconfirmed reports of autonomous weapons already being used in Ukraine, although I do not think it is helpful to repeat some of that because most of it is speculation.

We are seeing a rapid trend towards increasing autonomy in weapons systems. AI and computational methods are allowing machines to make more and more decisions themselves. We urgently need UK leadership to establish, domestically and internationally, when it is ethically and legally appropriate to delegate to a machine autonomous decision-making about when to take an individual’s life.

The UK Government, like the US, see AI as playing an important role in the future of warfighting. The UK’s 2021 Integrated Review of Security, Defence, Development and Foreign Policy sets out the Government’s priority of

“identifying, funding, developing and deploying new technologies and capabilities faster than our potential adversaries”,

presenting AI and other scientific advances as “battle-winning technologies”—in what in my view is the unhelpful context of a race. My fear of this race is that at some point the humans will think they have gone through the line but the machines will carry on.

In the absence of an international ban, it is inevitable that these weapons will eventually be used against UK citizens or soldiers. Advocating international regulation would not be abandoning the military potential of new technology, as is often argued. International regulation of AWS is needed to give our industry guidance so that the UK can be a sci-tech superpower without undermining our security and values. Only this week, the leaders of the German engineering industry called for the EU to create specific law and tighter regulation on autonomous and dual-use weapons, as they need to know where the line is and cannot be expected to draw it themselves. They have stated:

“Imprecise regulations would do damage to the export control environment as a whole.”

Further, systems that operate outside human control do not offer genuine or sustainable advantage in the achievement of our national security and foreign policy goals. Weapons that are not aligned with our values cannot be effectively used to defend our values. We should not be asking our honourable service personnel to utilise immoral weapons—no bad weapons for good soldiers.

The problematic nature of non-human-centred decision-making was demonstrated dramatically when the faulty Horizon software was used to prosecute 900-plus sub-postmasters. Let me explain. In 1999, entirely coincidentally at the same time as the Horizon software began to be rolled out in sub-post offices, a presumption was introduced into the law on how courts should consider electronic evidence. The new rule followed a Law Commission recommendation that courts should presume that a computer system has operated correctly unless there is explicit evidence to the contrary. This legal presumption replaced a section of the Police and Criminal Evidence Act 1984, PACE, which stated that computer evidence should be subject to proof that it was in fact operating properly.

The new rule meant that data from the Horizon system was presumed accurate. It made it easier for the Post Office, through its private prosecution powers, to convict sub-postmasters of financial crimes when there were accounting shortfalls based on data from the Horizon system. Rightly, the nation has felt moral outrage: this is, in scale, the largest miscarriage of justice in this country’s history, and we have a judiciary which does not understand this technology, so there was nothing in the system that could counteract this rule. Some sub-postmasters served prison sentences, hundreds lost their livelihoods and there was at least one suicide linked to the scandal. With lethal autonomous weapons systems, we are talking about a machine deciding to take people’s lives away. We cannot have a presumption of infallibility for the decisions of lethal machines: in fact, we must have the opposite presumption, or meaningful human control.

The ongoing war in Ukraine is a daily reminder of the tragic human consequences of conflict. With the use of lethal autonomous weapons systems in future conflicts, a lack of clear accountability for decisions made poses serious complications and challenges for post-conflict resolution and peacebuilding. The way in which these weapons might be used and the human rights challenges they present are novel and unknown. The existing laws of war were not designed to cope with such situations, any more than our laws of evidence were designed to cope with the development of computers, and, on their own, they are not enough to control the use of future autonomous weapons systems. Even more worryingly, once we make the development from AI to AGI, these systems can potentially develop at a speed that we humans cannot physically keep up with.

Previously in your Lordships’ House, I have referred to a “Stories of Our Times” podcast entitled “The Rise of Killer Robots: The Future of Modern Warfare?”. Both General Sir Richard Barrons, former Commander of the UK Joint Forces Command, and General Sir Nick Carter, former Chief of the Defence Staff, contributed to what, in my view, should be compulsory listening for Members of Parliament, particularly those who hold or aspire to hold ministerial office. General Sir Richard Barrons says

“Artificial intelligence is potentially more dangerous than nuclear weapons.”

If that is a proper assessment of the potential of these weapon systems, there can be no more compelling reason for their strict regulation and for them to be banned in lethal autonomous mode. It is essential that all of us who share responsibility for the weapons systems procured and deployed for use by our Armed Forces, whether Ministers or not, fully understand the implications and risks that come with those systems and understand exactly what their capabilities are and, more importantly, what they may become.

In my view, and I cannot overstate this, this is the most important issue for the future defence of our country, future strategic stability and potentially peace: that those who take responsibility for these weapons systems are civilians, that they are elected, and that they know and understand them. Anyone who listens to the podcast will dramatically realise why, because already there are conversations going on among military personnel that demand the informed oversight of politicians. The development of LAWS is not inevitable, and an international legal instrument would play a major role in controlling their use. Parliament, especially the House of Commons Defence Committee, needs to show more leadership in this area. That committee could inquire into what military AI capabilities the Government wish to acquire and how these will be used, especially in the long term. An important part of such an investigation would be consideration of whether AI capabilities could be developed and regulated so that they are used by armed forces in an ethically acceptable way.

As I have already mentioned, the integrated review pledged to

“publish a defence AI strategy and invest in a new centre to accelerate adoption of this technology”.

Unfortunately, the Government’s delay in publishing the defence AI strategy has cast doubt on the goal, stated in the Integrated Review of Security, Defence, Development and Foreign Policy, that the UK will become a “science and technology superpower”. The technology is already outpacing us, and at present the UK is unprepared to deal with the ethical, legal and practical challenges presented by autonomous weapons systems. Will that change with the publication of the strategy and the establishment of the Autonomy Development Centre? Perhaps the Minister can tell us.

Gambling Commission: Data

Wednesday 20th October 2021
Lords Chamber

Lord Browne of Ladyton (Lab)

My Lords, I have given notice of my question. Recent research reveals, in one in four gamblers, a correlation between higher rates of gambling spend as a proportion of income and gambling harm. This challenges the Government’s oft-repeated view that

“the vast majority of people who gamble do not experience harm”.—[Official Report, 7/1/21; col. 281.]

The Minister’s predecessor dismissed this research when I brought it to her attention, because it does not establish a causative link between gambling spend and gambling harm. Surely the correct response is for the Minister to engage with this research and expand upon it to see whether it can prove that link, rather than dismissing it and preferring surveys of high-risk gamblers.

Lord Parkinson of Whitley Bay (Con)

My Lords, I am grateful to the noble Lord for the advance notice; it gave me an opportunity to look at his Written Question and the reply from my noble friend. I do not think she was dismissing what he said. This is simply a product of what is still, as I have said, an emerging area in which data and research are being gathered. Dr Naomi Muggleton’s research has been an important contribution to our efforts to understand the widening impacts of gambling harm. Our review is looking at the barriers to conducting high-quality research such as this, which can inform our policy. Following the publication of the PHE review which we debated last week, we are working with the DHSC and others to complete that picture and improve the data and research we have.

Gambling

Thursday 7th January 2021
Lords Chamber

Baroness Barran (Con)

The noble Lord is right to raise the issue of gambling during lockdown. The evidence of an increase is not as clear-cut as he suggests. We are concerned and have taken very prompt action, including requiring operators to intervene in online gambling sessions lasting more than an hour, and increasing affordability checks.

Lord Browne of Ladyton (Lab) [V]

My Lords, problem gambling disproportionately affects vulnerable groups, exacerbates social inequalities and imposes large economic costs on society. Back in 2017, the Gambling Commission described it as a public health concern. Does the Minister agree that if problem gambling is to be taken seriously as a public health issue, policy responsibility for prevention and treatment should primarily lie with the Department of Health and Social Care and not the DCMS, a department described by the Public Accounts Committee as both slow and weak on this subject?

Baroness Barran (Con)

We do not see ourselves, or the officials working in this area, as slow or weak. As the noble Lord knows, the Department of Health and Social Care is responsible for the Government’s addiction strategy across all forms of addiction. He will be aware of the comorbidity between different forms of addiction, and there are other aspects to gambling. We know that the vast majority of people who gamble do not experience harm, and that is the balance the department is trying to strike: to reduce the harm and to allow those who gamble safely to do so.

Historic Statues

Monday 19th October 2020
Lords Chamber

Baroness Barran (Con)

The noble Baroness gives some very helpful examples. The Government share her concern, particularly at some of the scenes we have seen recently, which have been deeply troubling. It is very unfortunate when figures such as Churchill have to be boarded up to avoid desecration. The Government continue to prioritise this.

Lord Browne of Ladyton (Lab) [V]

Webster’s Dictionary’s definition of putting someone on a pedestal is

“to think of someone as a perfect person with no faults: to admire someone greatly”.

The erection of a statue is not an objective act, but a subjective judgment of an individual’s historical contribution. Does the Minister agree that just as the civic leadership of communities most often decided who should have a statue placed on a pedestal in public places, their modern equivalents, not Ministers, should be trusted to decide whose statues are representative of a community’s current values?

Baroness Barran (Con)

Obviously local authorities are primarily responsible in this area and will take the view of their community into consideration, but my understanding is that for the most contested examples there has been not a uniform community view, but a divided one.

Covid-19: Artificial Intelligence

Wednesday 9th September 2020
Lords Chamber

Baroness Barran (Con)

This Government want to be a leader in the regulation of AI, balancing a pro-growth, pro-innovation economy with one upholding the highest ethical standards.

Lord Browne of Ladyton (Lab) [V]

My Lords, the endless development delays to the launch of the Covid-19 contact-tracing app, the Home Office decision, ahead of a judicial review, to scrap the controversial visa applications AI system, which was biased in favour of white applicants, and—most embarrassing of all—the A-level exam results scandal have all reinforced barriers to AI take-up as identified in this report, so what lessons have the Government learned from these fiascos?

Gambling Advertising

Thursday 25th June 2020
Lords Chamber

Baroness Barran (Con)

My noble friend brings great experience to this, including from his time as a Minister at the Home Office. There are currently no plans to move responsibility for gambling to the Home Office, although my department works very closely with the Home Office and others in overseeing this. In relation to my noble friend’s comments about social media, work is going on specifically in that area to make sure that adverts are not targeted at people under 25 or at children. We are working actively with the platforms to ensure that gambling ads do not appear for those who have self-excluded from gambling.

Lord Browne of Ladyton (Lab) [V]

My Lords, almost a year ago, on 2 July, in a parliamentary Statement, the Government announced three measures agreed with gaming companies to

“deliver real and meaningful progress on support for problem gamblers”.—[Official Report, 2/7/19; col. 1345.]

The noble Lord, Lord Ashton of Hyde, said that the Government expected change and that, if it did not manifest, they would take other measures, not ruling out legislation. Is it the Government’s judgment that the industry’s actions are delivering real, meaningful progress? What metrics are the Government using, and will they publish their calculations?

Baroness Barran (Con)

The main metric that the Government use to measure the extent of problem gambling is the British Gambling Prevalence Survey, which looks at population levels of problem gambling. That level has remained unchanged for 20 years, at slightly below 1%. I appreciate the context of the noble Lord’s question: with the prevalence of gambling advertising and promotion, intuitively one would expect that figure to rise, but there is no evidence of that at the moment.

Algorithms: Public Sector Decision-making

Wednesday 12th February 2020
Lords Chamber

Lord Browne of Ladyton (Lab)

My Lords, it is a pleasure to follow the noble Lord. At the heart of his speech he made a point with which I violently agree: the pace of science and technology is utterly outstripping our ability to develop public policy to engage with it. We are constantly catching up. This is not a specific point for this debate but a general conclusion that I have come to. We need to reform the way in which we make public policy to allow the flexibility, within the boxes of what is permitted, for advances to be made while remaining within a regulated framework. But perhaps that is a more general debate for another day.

I am not a graduate of the Artificial Intelligence Select Committee. I wish I had been a member of it. When its very significant report, widely recognised as excellent, was debated in your Lordships’ House, I put my name down to speak. I found myself in a very small minority of those who had not been members of the committee, but I did it out of interest rather than knowledge. It was an extraordinary experience. I learned an immense amount in a very short time in preparing a speech that I thought would keep my end up among all the people who had spent so much time involved in the subject. I did the same when I saw that the noble Lord, Lord Clement-Jones, had secured this debate, because I knew I was guaranteed to learn something. I did, and I thank him for the consistent tutoring I have received by following his contributions in your Lordships’ House. I am extremely grateful to him for securing this debate, as the House should be.

I was honestly stunned to see the extensive use of artificial intelligence technology in the public services. There is no point in my trying to compete with the list of examples the noble Lord gave in opening the debate so well. It is being used to automate decision processes and to make recommendations and predictions in support of human decisions—or, more likely in many cases, human decisions are required in support of its decisions. A remarkable number of these decisions rely on potentially controversial data usage.

That leads me to my first question for the Minister. To what extent are the Government—who are responsible for all of this public service in accountability terms—aware of the extent to which potentially controversial value judgments are being made by machines? More importantly, to what degree are they certain that there is human oversight of these decisions? Another element of this is transparency, which I will return to in a moment, but in the actual decision-making process we should not allow automated value judgments where there is no human oversight. We should insist on a minimum understanding, on the part of the humans involved, of what has prompted that value judgment from the data.

I constantly read examples of decisions being made by artificial intelligence machine learning where the professionals who are following them are unable to explain them to the people whose lives are being affected by them. When they are asked the second question, “Why?”, they are unable to give an explanation because the machine can see something in the data which they cannot, and they are at a loss to understand what it is. In a medical situation, there are lots of black holes in the decisions that are made, including in the use of drugs. Perhaps we should rely on the outcomes rather than always understanding. We probably would not give people drugs if we knew exactly how they all worked.

So I am not saying that all these decisions are bad, but there should be an overarching rule about these controversial issues. It is the Government’s duty at least to know how many of these decisions are being made. I want to hear an assurance that the Government are aware of where this is happening and are happy about the balanced judgments that are being made, because they will have to be made.

I push unashamedly for increased openness, transparency and accountability in algorithmic decision-making. That is the essence of the speech that opened this debate, and I agree 100% with all noble Lords who made speeches to that effect. I draw on those speeches and ask the Government to ensure that, where algorithms are used, this openness and transparency are required and not just permitted because, unless they are required, people will not know why decisions about them have been made. Most of those people have no idea how to ask for the openness that they should expect.

Problem Gambling

Tuesday 2nd July 2019
Lords Chamber

Lord Ashton of Hyde

The right reverend Prelate, who has been vociferous in his views on this—I have been on the receiving end for several years—has done good work, but he is exaggerating slightly. On the increase in the proportion of problem gamblers receiving treatment, we will never reach 100%: they have to agree to be part of it. We have significantly increased the resources available to do it. We had one gambling clinic in the east; another has just opened, specialising in children. We have announced plans to open 14, and today’s announcement is in addition to that. In every other sphere of potentially harmful industries, such as smoking and alcohol, the industry pays taxes and the treatment of people affected by those industries is paid for out of general taxation. The gambling industry pays £3 billion in gambling tax plus income tax and NI. In addition to that, the top five companies have agreed to pay a 1% levy on that to fund treatment. They are producing a large amount of money. Because it will be transparent, we will monitor what needs to be done, but this is a dramatic increase in resources in a very short time and it will make a significant difference.

Lord Browne of Ladyton (Lab)

My Lords, betting advertising around televised sport, both live and recorded, has reached saturation point. The dominant sports broadcaster in this country is part of a bigger business that has a sporting and gaming division, and it advertises on the platform on which the sport is broadcast. The sports advertisers use sports presenters and sports pundits to advertise, and they advertise live odds on the events that are being broadcast. All this is aimed at—and has succeeded in—blurring the difference between the advertising and the event itself. It all looks like one thing, and it works for them. In this context, therefore, what on earth is expected to be delivered by a voluntary ban on advertising around live sport only during the daytime? What do the Government expect this to deliver in reducing the problem, and why did they agree to it if they did not expect it to make a real difference?

Lord Ashton of Hyde

I do not see why the noble Lord thinks that the proposal will not make a difference; it is in addition to measures in other areas. It works in sync with the agreement that now exists to use online technology to target gambling advertisements away from people identified as being at risk of problem gambling. Responsible gambling messaging will be increased, and the tone and content of marketing will be reviewed. That is in addition to the previous commitment that the noble Lord mentioned of a whistle-to-whistle ban and the funding of a new multimillion-pound responsible gambling advertising campaign led by GambleAware. We are asking gambling firms to act responsibly. Where they do not, we will continue to talk to them as we have—the results of which have come today. We are not, however, ruling out legislation. We expect change and we expect firms to behave responsibly, but if they do not, we will have to take other measures.