Online Filter Bubbles: Misinformation and Disinformation

Damian Collins Excerpts
Tuesday 16th January 2024


Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

This information is provided by Parallel Parliament and does not comprise part of the official record

Damian Collins (Folkestone and Hythe) (Con)

I congratulate my hon. Friend the Member for Weston-super-Mare (John Penrose) on securing the debate. It could not be more important or timely; as he alluded to in his speech, half the world is voting this year. We have already seen some of those elections take place in Taiwan and Bangladesh, and in America the Republican party had the Iowa caucus last night. It is interesting to see a demonstration of the impact of disinformation on ordinary people going about their daily lives. It has been heavily reported in America that 60% of Republicans voting in the caucus believe that the last presidential election was stolen and illegitimate and that Donald Trump was elected as President.

The challenge of disinformation is not just from foreign state interference. When we first started talking about the issue some five or six years ago, we were looking principally at the influence networks of Russia and Iran and their ability to try to reshape the way in which people saw the world and the institutions in their own countries to sow fear and discord and make people distrust the institutions of their own society, the legitimacy of their courts, the freedom of their elections and the truth of their media. However, it is happening in our society as well. The pandemic was a demonstration of the potency of conspiracy networks such as QAnon to persuade people that the vaccine was not safe, and we see it today to persuade people that our public institutions and elections are not safe. It is being done to undermine the fabric of democracy. There is a lot more to being a democracy than holding elections, and having faith in our institutions, trusting our media and trusting the news and information that we get are all essential to the citizen’s job of casting their vote every four or five years to determine who should run their country. If that is attacked and undermined, it is an attack on our entire democratic way of life. This year, we will see that challenge in a way that we have not seen before, with a level of technical sophistication that we have not seen before, and we should be concerned about it.

I will respond briefly to the remarks by the hon. Member for Glasgow South (Stewart Malcolm McDonald) in his speech. I was briefly the Minister responsible for the Counter Disinformation Unit, and I thought that I had better meet it, because it is not a particularly public-facing organisation, to see what it had to say. The Government quite rightly have different strategies for dealing with disinformation across Government: some of it is led by policing and security; some of it is led by looking at bad actors internally; and some of it is led by the Foreign Office and Ministry of Defence looking at bad actors externally. The Government should trigger different responses: some that respond with news and information that challenge conspiracy theories and networks, and some that identify networks of disinformation being controlled and operated by foreign states against which we want companies and platforms to take action. That was included in the National Security Act 2023 last year, and the Online Safety Act places a further obligation on companies to act in response to intelligence reports that they receive. If they do not take action against those known networks of disinformation controlled and run by hostile foreign states, action can be taken against the companies as well.

That is why the Online Safety Act was so important; it creates, for the first time, the principle of liability of platforms for the information that they distribute and promote to other users. Central to the debate on the Bill that became the Online Safety Act was finally answering the false question that was posed all the time: are platforms, such as Facebook, actually platforms or publishers? They do not write the content, but they do distribute it. People have first amendment rights in America to speak freely, and we have freedom of speech rights in this country—that is not the same as the right actively to be promoted to millions of people on a social media platform. They are different things. The companies promote content to users to hold their attention, drive engagement and increase advertising revenue. It is a business decision for which they should be held to account, and the Online Safety Act now gives a regulator the power to hold companies to account for how they do that.

I listened carefully to what my hon. Friend the Member for Weston-super-Mare said about whether we could borrow from the broadcasting code to try to create standards. Can we break filter bubbles by trying to give people access to different sorts of information? I think this is a difficult area, and there are subtle differences between a broadcaster and a social media platform. It is true that they both reach big audiences. It is also true that social media platforms exercise editorial decisions, just like a broadcaster does. However, the reason why it was so important for broadcasting and broadcasting licences to make sure that there were fair standards for balance and probity was that there were not that many broadcasters when the licences were introduced. The list has now grown. People tuning in do not necessarily know what they will get, because the content is selected and programmed by the programme maker and the channel.

I would say that social media have become not broadcast media, but the ultimate narrowcast media, because the content to which people are being exposed is designed for them. An individual’s principal experience of being on social media is not of searching for things, but of having things played and promoted to them, so the responsibility should lie with companies for the decisions they make about what to promote. There is nothing wrong with people having preferences—people have preferences when they buy a newspaper. I am sure that when the hon. Member for Strangford (Jim Shannon) watches services by Rev. Ian Paisley on YouTube, he does not want to get a prompt saying, “You’ve had enough this week. We’re going to give you some content from the Sinn Féin party conference.” We do not want that kind of interference going on. People have perfectly legitimate viewing habits that reflect their own preferences. The question is, do platforms push and actively promote conspiracy theories and fake news? I think they do, and there is evidence that they have done so.

I will mention one of the clearest examples of that in the brief time I have left. In the 2020 US presidential election, the platforms agreed, under pressure, to give far greater prominence to trusted news sources in their newsfeeds, so that people were far more likely to see content from a variety of different broadcasters. It was not necessarily all from CNN or Fox News—there could be a variety—but it was from known and legitimate news sources as a first preference. The platforms downgraded what they call civic groups, which are the friends and family groups that are often the breeding ground for conspiracy theories. One reason why they often spread so quickly is that people push them on their friends, who look at such content because it has come from someone they know and trust. However, when the platforms changed the ranking and promotion factor, it had a big impact: it dampened down disinformation and promoted trusted news sources, but it also reduced engagement with the platform. After the election, Facebook reversed the change and the conspiracy theorists were allowed to run riot again, which was a contributing factor in the insurrection we saw in Washington in January 2021.

Companies have the ability to make sure that fair and trusted news gets a better crack, which is absolutely essential in this digital age. They should be very wary about allowing AI to use content and articles from legitimate news organisations as training data to create what would effectively become generic copies to sell advertising against, steering people away from journalism that people have to pay for and towards free content that looks very similar but is far less likely to be trustworthy. We need to get the news bargaining code right so that proper news organisations do not see their content being distributed for free, and ads sold against it by other people, without getting fair remuneration. These are things we can do to protect our news ecosystem, and the Online Safety Act is essential for making sure that Ofcom holds companies to account for actively promoting known sources of conspiracy theories and disinformation. It is important to tackle the big threat to democracy, just as it is important to combat fraud and protect citizens from financial harm.

Several hon. Members rose—

Saqib Bhatti

My right hon. Friend makes an important point. As I make progress, I hope he will be reassured that the regime will take both those things into account.

Together, amendments 13 and 14 will make sure that consumers get the best outcomes. Amendment 14 makes an important clarification on the role of third parties in the final offer mechanism process. New clause 5 and related amendments will clarify when and how third parties may make collective submissions in relation to the final offer mechanism. That is vital, as collective bargaining can help to address power imbalances during negotiations. We expect that third parties, especially smaller organisations, may seek to work together when negotiating payment terms and conditions.

My second theme is the accountability of the regulator. The discretion afforded to the CMA and its accountability to Government and Parliament have formed a large part of the debate—quite rightly—during the passage of the Bill. I will take time to address that.

The digital markets regime is flexible in its design, with the CMA requiring a level of discretion to deliver effective outcomes. While that is common for ex ante regulation, that does not negate the importance of taking steps to maximise the predictability and proportionality of the regulator’s actions. For that reason, the Government are introducing an explicit requirement for the CMA to impose conduct requirements and pro-competition interventions only where it considers that it is proportionate to do so.

That will make it clear to firms in scope of the regime that they will not be subject to undue regulatory burdens. Firms will be able to challenge disproportionate obligations, and the Competition Appeal Tribunal will, in its consideration of any appeals, apply the principle of proportionality in a reasonable way, as it always does. To complement that, and to ensure consistent senior oversight and accountability of the regime, amendments 57 to 60 require enforcement decisions, including the imposition of penalties, to be reserved to the CMA board or its committee.

Damian Collins (Folkestone and Hythe) (Con)

I welcome my hon. Friend to his position, and congratulate him on his role. The Government amendments relate to the proportionality test for conduct requirements. Why did the Government feel that there was a need for those additional tests? Was there a concern that the CMA would use the power disproportionately, and if so, what might such a use have been?

Saqib Bhatti

I thank my hon. Friend for his contribution to the House on these matters, and for that question. The aim of the amendments is to provide clarity and give certainty—clarity that we will always ensure that the consumer is at the heart of what we do, and certainty because that is what business always needs. I will happily give further clarity in my closing remarks. To ensure robust oversight of the DMU’s implementation of the regime, we are also requiring that the Secretary of State approve the publication of guidance relating to part 1 of the Bill.

--- Later in debate ---
Sir Robert Buckland

That is what we need to bottom out. The primary worry that a lot of us have about the JR principle is that it means that any challenge will probably be vanishingly small, which is not good for ensuring that the regulator is working in the best way. None of us wants to encourage incontinent litigation—or incontinent legislation, bearing in mind the importance that we place on it—but sometimes, challenge is essential to create greater certainty. There will be ambiguities; there will be occasions where there needs to be a test. We should not be frightened of that.

Damian Collins

I am following what my right hon. and learned Friend says carefully. Does he agree that we have to consider the nature of this business landscape? For these firms—some of the biggest companies in the world—litigation is a cost of doing business. Their track record shows that they use almost every ground available to challenge any decision made by any regulator. No regulator is resourced sufficiently to contest those challenges, and the people who seek to bring them know that they will take years and cost a huge amount of money, and that the business affected may even have closed by the time a resolution is found.

Sir Robert Buckland

I fully take on board my hon. Friend’s concern. He is right to say that, which is why this should not just be about what might happen in terms of raw dispute; it has to be the culture of the new regulator to work with any potential subject—any company that might be a subject of an investigation—in a co-operative way. That raises the issue of how open the parties are with each other about the basis of their assertions and of how data is shared—that goes right into the Competition Appeal Tribunal itself. A lot of people would be surprised that the disclosure rules in the CAT are not as open as one would expect them to be if one is challenging a decision. We have to work our way through that, in order to change that attitude and reduce the amount of potential litigation by making sure that there is agreement.

I accept that the Government have moved on the JR test with regard to penalty, but a potential problem could result from the Government’s amendment on that: there will not be a change of culture, there will be a readiness by big tech to admit breach and then all resources will be thrown into contesting the penalty. There we will get the litigation, the real argument and the high-stakes money. To paraphrase my hon. Friend, we will get the actuarial calculation that it would be worth throwing a lot of money at litigation to reduce a penalty that could be a big percentage of turnover. We are potentially talking about huge penalties for these companies.

That issue does worry me and I hope that it demonstrates to the House why I am properly sensitive about the need to make sure that we do not just open the door to abuse by another means. I am a huge follower of Theodore Roosevelt and a great believer that his approach to fighting the J.P. Morgans and the Standard Oils of his day is exactly how we should operate in the monopolistic markets of today and tomorrow. My hon. Friend is right to say that this market is fast developing. When the Furman report was produced, we were looking at a different world in big tech. With the rise of artificial intelligence, we are seeing it evolve further.

Damian Collins

I am grateful to my right hon. and learned Friend for giving way, particularly as we are on the subject of Theodore Roosevelt. Does he agree that we have to be careful when considering consumer detriment in this case? In the United States, the argument was not successfully made that J.P. Morgan could say that, although he had a railway monopoly, ticket prices were relatively low and so there was no consumer detriment. That was not considered a binding argument, so the fact that the cost of an app in an app store might be low does not mean that the company should be able to get away with overcharging.

Sir Robert Buckland

Again, I am grateful to my hon. Friend. He is right: there is a danger that in regulation we focus on the cost of the good or service, rather than on the overall environment and quality of the market. Some would say that that has been a particular issue in the way that regulation has operated in the water sector. That is why this is a good moment for all of us, as a House, to pause and reflect on where we have gone wrong with regulation in the past and how we can get it right from here on in.

There are some options the Government can look at when dealing with the JR standard. I have mentioned the importance of making sure that there is accountability, but we should not just be looking at the sunset option that I have set out in my amendment; we should look again at whether the clarification of the proportionality test could help everybody to understand precisely how the JR principles will work. If we miss the opportunity on this occasion to get this right, I am not sure we will be doing anybody any favours, least of all the consumer and especially not the DMU itself, which needs to develop in a way that is truly accountable.

The thrust of some of my amendments relates to the regulator’s accountability to this place, which is why they include a requirement to report regularly to Parliament and to Ministers. New clause 12 relates to the appointment of the senior director of the DMU, which I think should be done directly by the Secretary of State. That is not a challenge to the independence of the body; Ministers regularly appoint independent directors and inspectors, for example, and it does not undermine the integrity and quality of their role. However, through those amendments I am seeking to make the case that we should not confuse independence for lack of accountability. I do not use that word as a way of avoiding a greater accountability to this place.

--- Later in debate ---
Damian Collins (Folkestone and Hythe) (Con)

Listening to this debate, I was reminded of a remark attributed to a major United States tech investor, who said that it had always amused him that people thought competition and capitalism were the same thing. While competition can be a great driver of economic growth, the acquisition of capital and the creation of new markets, there are equally plenty of capitalistic enterprises that have grown wealthy on the back of a lack of competition, through market domination. That is why this legislation is so important.

Superficially, it is tempting to look at the landscape of the digital economy and say that the fact that there are a number of very big companies is evidence of effective competition between those companies. Those companies, including Amazon, Apple, Google and Meta, may compete for the provision of some services, but they largely dominate markets where they are the central player. We have heard throughout the passage of the Bill that even major businesses seeking to sell their goods through, say, Amazon as an online retail platform cannot afford to have a public dispute with that platform, because their relationship with that company is fundamental to the success of their business. Major publishing companies have talked about the fact that contract renegotiation with companies such as Amazon can come with big costs attached, but that ultimately they have to do business through them.

Cloud storage, which is currently an area of investigation for the CMA, is going to be a vital piece of business infrastructure for anyone who operates in the digital economy, but again, it is dominated by one or two companies, principally Google and Amazon. There are only two operating systems for our mobile phone devices. One is Android, which is owned by Google; the other is Apple’s iOS system. They both have app stores, and there is a lack of interoperability between them. We therefore have app store markets that are actual monopolies. This has been investigated by the CMA and it has billions of pounds of consumer detriment in overpricing and variable pricing attached to it.

We know that these anti-competitive forces exist. In its recent ruling on the proposed Microsoft-Activision merger, the CMA was right to highlight that if a company that creates video games that people like to play is allied to a cloud system owned by a dominant company, people might only be able to access the service if they pay that cloud provider—the storage gatekeeper or guardian of that service—which could have consumer detriment down the line.

We are already seeing examples such as market domination and self-preferencing. Google has been investigated by the European Commission over self-preferencing. This is where companies are not just creating an easy-to-use service across multiple products for people, but doing so in a way that excludes others from that market. In the long run we must be concerned about the consumer detriment of market power being consolidated into the hands of a relatively small number of companies. An example that Members will probably all be familiar with is the mobile mapping app market. It used to be quite a vibrant market with a number of players in it, but it is now largely dominated by two, Google and Apple. That is not to say that the interest of companies is always against the consumer interest, but we should be mindful of the fact that in many of these markets, monopolistic conditions can easily be created, so we should be concerned about abusive market power. There is already some evidence of that.

--- Later in debate ---
Saqib Bhatti

That is an important point, and I appreciate my right hon. and learned Friend giving me the opportunity to clarify it. I want to be unequivocal that, from my perspective, the threshold is still high and we have provided clarity. If he requires even further clarity, I am happy to write to him to be completely clear.

Damian Collins

I am grateful for what my hon. Friend has said so far about the application of the proportionality test, but if he is to follow up with Members in writing with some clarity, can he set out what he believes the grounds for challenge would be on the basis of proportionality? The interventions that the CMA may make and the rulings it may give are at the end of quite a lengthy process of market analysis, demonstration of abuse of market power and breach of conduct requirements. If those are challenged routinely and at a late stage, on the basis that there are grounds to say that it is disproportionate, it could have the unintended consequence of delaying systems in a way that they should not be delayed.

Saqib Bhatti

If I heard my hon. Friend correctly, he wanted a letter on that. This legislation is designed to make sure that big companies cannot litigate heavily to stifle the smaller challengers before they can become the big companies and employers of tomorrow. Let me write to him to clarify the point further.

My right hon. and learned Friend the Member for South Swindon has spoken about accountability in my numerous conversations with him over the past few days, and again today. I take his point. He will know that I want independent, versatile, flexible and adaptable regulators. That is only right for an ever-changing digital market that is always innovating and changing the way it operates. We do not know who the unicorns of tomorrow will be, or the benefits that they will bring to consumers. The Competition and Markets Authority and the DMU have a responsibility to be accountable, to maintain that flexibility and to have adaptability to new technology and new entrants in the market. As I am sure he knows and respects, that is why independent regulators are a central part of our internationally recognised business environment. We should not forget that point.

I take the points about overreach by regulators, but they are a core part of what international partners and investors look at when it comes to the competition regime, because they know that it will be innovative and will encourage further innovation in technology. The CMA is operationally independent from Government, and Government will not intervene in its regulatory decisions. The DMU will have discretion in how it designs its interventions under the regime. That discretion is matched with robust accountability, from initial decision making to appeals.

There is a range of checks and balances throughout the regime that provide assurance. I hope that reassures my right hon. Friend. There are opportunities for Government, Parliament and stakeholders to hold the CMA to account, but I welcome his challenges and interventions on this point, because it is important. I am sure that this will be looked at again in the other place. Government should always be sensitive to those challenges. The digital markets regime will be overseen by the CMA's board, which is accountable to Parliament for all key decisions. Key decisions will be taken by a committee, at least half of whose members will offer an independent perspective. I am sure that he will welcome that because, as new technologies and innovations emerge in the market, we will need new expertise.

Damian Collins

My right hon. and learned Friend the Member for South Swindon (Sir Robert Buckland) made the important point that the growth and expansion of regulation in digital markets is necessary but substantial. The ability of this place to keep track of how the regulators use their powers is increasingly important. That may be beyond the work of any departmental Select Committee, but instead requires something like the Public Accounts Committee, as he suggested—a separate committee whose job is to focus on and scrutinise such work. That was recommended by the House of Lords Communications and Digital Committee, and also by the Joint Committee on the Online Safety Bill. I do not expect the Minister to give us an answer right now, but if he could reflect on that need and give some guidance to the House, that would be welcome.

Saqib Bhatti

My hon. Friend makes an important point that is a matter for wider discussions on accountability. I am happy to have that discussion with him in future. As things currently stand, there are sufficient balances and checks in place, but I am always open to having further discussions with him.

Online Safety Bill

Damian Collins Excerpts
Paul Scully

I have talked about the fact that we have to keep this legislation under review, because the landscape is fast-moving. At every stage that I have been dealing with this Bill, I have said that inevitably we will have to come back. We can make the Bill as flexible, proportionate and tech-unspecific as we can, but things are moving quickly. With all our work on AI, for example, such as the AI summit, the work of the Global Partnership on Artificial Intelligence, the international response, the Hiroshima accord and all the other areas that my hon. Friend the Member for Weston-super-Mare (John Penrose) spoke about earlier, we will have to come back, review it and look at whether the legislation remains world-beating. It is not just about the findings of Ofcom as it reports back to us.

I need to make a bit of progress, because I hope to have time to sum up a little bit at the end. We have listened to concerns about ensuring that the Bill provides the most robust protections for children from pornography and on the use of age assurance mechanisms. We are now explicitly requiring relevant providers to use highly effective age verification or age estimation to protect children from pornography and other primary priority content that is harmful to children. The Bill will also ensure a clear privacy-preserving and future-proofed framework governing the use of age assurance, which will be overseen by Ofcom.

There has been coverage in the media about how the Bill relates to encryption, which has often not been accurate. I take the opportunity to set the record straight. Our stance on challenging sexual abuse online remains the same. Last week in the other place, my noble Friend Lord Parkinson, the Parliamentary Under-Secretary of State for Arts and Heritage, shared recent data from UK police forces that showed that 6,350 offences related to sexual communication with a child were recorded last year alone. Shockingly, 5,500 of those offences took place against primary school-age children. Those appalling statistics illustrate the urgent need for change. The Government are committed to taking action against the perpetrators and stamping out these horrific crimes. The information that social media companies currently give to UK law enforcement contributes to more than 800 arrests or voluntary attendances of suspected child sexual offenders on average every month. That results in an estimated 1,200 children being safeguarded from child sexual abuse.

There is no intention by the Government to weaken the encryption technology used by platforms. As a last resort, on a case-by-case basis, and only when stringent privacy safeguards have been met, Ofcom will have the power to direct companies to make best efforts to develop or source technology to identify and remove illegal child sexual abuse content. We know that this technology can be developed. Before it can be required by Ofcom, such technology must meet minimum standards of accuracy. If appropriate technology does not exist that meets these requirements, Ofcom cannot require its use. That is why the powers include the ability for Ofcom to require companies to make best endeavours to develop or source a new solution.

Damian Collins (Folkestone and Hythe) (Con)

Does my hon. Friend agree that the companies already say in their terms of service that they do not allow illegal use of their products, yet they do not say how they will monitor whether there is illegal use or what enforcement action they take? What the Bill gives us, for the first time, is the right for Ofcom to know the answers to those questions and to know whether the companies are even enforcing their own terms of service.

Paul Scully

My hon. Friend makes an important point, and I thank him for the amazing work he has done in getting the Bill to this point and for his ongoing help and support in making sure that we get it absolutely right. This is not about bashing technology companies; it is about not only holding them to account, but bringing them closer, to make sure that we can work together on these issues to protect the children I was talking about.

Despite the breadth of existing safeguards, we recognise the concerns expressed about privacy and technical feasibility in relation to Ofcom’s power to issue CSE or terrorism notices. That is why we introduced additional safeguards in the Lords. First, Ofcom will be required to obtain a skilled person’s report before issuing any warning notice and exercising its powers under clause 122. Ofcom must also provide a summary of the report to the relevant provider when issuing a warning notice. We are confident that in addition to Ofcom’s existing routes of evidence gathering, this measure will help to provide the regulator with the necessary information to determine whether to issue a notice and the requirements that may be put in place.

We also brought forward amendments requiring Ofcom to consider the impact that the use of technology would have on the availability of journalistic content and the confidentiality of journalistic sources when considering whether to issue a notice. That builds on the existing safeguards in clause 133 regarding freedom of expression and privacy.

We recognise the disproportionate levels of harm that women and girls continue to face online, and that is why the Government have made a number of changes to the Bill to strengthen protections for women and girls. First, the Bill will require Ofcom to produce guidance on online harms that disproportionately affect women and girls and to provide examples of best practice to providers, and it will require providers to bring together in one clear place all the measures that they take to tackle online abuse against women and girls on their platforms. The Bill will also require Ofcom to consult the Victims’ Commissioner and the Domestic Abuse Commissioner, in addition to the Children’s Commissioner, while preparing codes of practice. That change to the Bill will ensure that the voices of victims of abuse are brought into the consultation period.

--- Later in debate ---
This is not the perfect Bill. This is not necessarily the Bill that I would have liked to see. It has gone through so many changes and iterations over the time we have been trying to scrutinise it that some of it has gone back to what it previously looked like, except the harmful content in relation to adults. I am pleased that the internet will be a safer place for our children and our children’s children. I am pleased that they will have more protections online. I have an amount of faith and cautious optimism in the work of Ofcom, because of how fast it has been scaling up and because of the incredible people it has employed to work there—they really know what they are talking about. I wish the Government and Ofcom every success in ensuring that the Bill is embedded and ensuring that the internet is as safe as possible. I would just really like a commitment from the Minister on ensuring that this legislation is kept under proper review and that legislative change will be made, should we identify any loopholes.
Damian Collins

The draft Bill was published in April 2021, so it is fantastic that we are now discussing its final stages after it has gone through its processes in the House of Lords. It went through pre-legislative scrutiny, then it was introduced here, committed to the Bill Committee, recommitted, came back to the House, went to the Lords and came back again. I do not think any Bill has had as much scrutiny and debate over such a long period of time as this one has had. Hon. Members have disagreed on it from time to time, but the spirit and motivation at every stage have never been political; it has been about trying to make the Bill the best it can possibly be. We have ended up with a process that has seen it get better through all its stages.

Picking up on the comments of the hon. Member for Aberdeen North (Kirsty Blackman) and others, the question of ongoing scrutiny of the regime is an important one. The pre-legislative scrutiny Committee—the Joint Committee that I chaired—recommended that there should be a post-legislative scrutiny Committee or a new Joint Committee, perhaps for a limited period. The pre-legislative scrutiny Committee benefited enormously from being a Committee of both Houses. Baroness Kidron has rightly been mentioned by Members, and she is watching us today from the Gallery, keeping up her scrutiny of the Bill’s passage from that position of advantage.

We have discussed a number of new technologies during the Bill’s passage that were not discussed at all on Second Reading because they were not live, including the metaverse and large language models. We are reassured that the Bill is futureproof, but we will not know until we come across such things. Ongoing scrutiny of the regime, the codes of practice and Ofcom’s risk registers is more than any one Select Committee can do. The Government have previously spoken favourably of the idea of post-legislative scrutiny, and it would be good if the Minister could say whether that is still under consideration.

Sir John Hayes

My hon. Friend makes a powerful point, echoing the comments of Members on both sides of the House. He is absolutely right that, as well as the scale and character of internet harms, their dynamism is a feature that Governments must take seriously. The problem, it seems to me, is that the pace of technological change, in this area and in others, does not fit easily with the thoroughness of the democratic legislative process; we tend to want to do things at length, because we want to scrutinise them properly, and that takes time. How does my hon. Friend square that in his own mind, and what would he recommend to the Government?

Damian Collins

The length of the process we have gone through on this Bill is a good thing, because we have ended up with probably the most comprehensive legislation in the world. We have a regulator with more power, and more power to sanction, than anywhere else. It is important to get that right.

A lot of the regulation is principle-based. It is about the regulation of user-to-user services, whereby people share things with each other through an intermediary service. Technology will develop, but those principles will underpin a lot of it. There will be specific cases where we need to think about whether the regulatory oversight works in a metaverse environment in which we are dealing with harms created by speech that has no footprint. How do we monitor and scrutinise that?

One of the hardest challenges could be making sure that companies continue to use appropriate technology to identify and mitigate harms on their platforms. The problem we have had with the regime to date is that we have relied on self-reporting from the technology companies on what is or is not possible. Indeed, the debate about end-to-end encryption is another example. The companies are saying that, if they share too much data, there is a danger that it will break encryption, but they will not say what data they gather or how they use it. For example, they will not say how they identify illegal use of their platform. Can they see the messages that people have sent after they have sent them? They will not publicly acknowledge it, nor say what triggers they could use to intervene, but the regulator will now have the right to see that information. That principle of accountability and the power of the regulator to scrutinise are the two things that make me confident that this will work, but we may need to make amendments because of new things that we have not yet thought about.

Sir William Cash

In addition to the idea of annual scrutiny raised by my right hon. Friend the Member for Haltemprice and Howden (Mr Davis), does my hon. Friend think it would be a reasonably good idea for the Select Committee on Culture, Media and Sport to set up a Sub-Committee under its Standing Orders to keep an eye on this stuff? My hon. Friend was a great Chairman of that Select Committee, and such a Sub-Committee would allow the annual monitoring of all the things that could go wrong, and it could also try to keep up with the pace of change.

Damian Collins

When I chaired the Digital, Culture, Media and Sport Committee, we set up a Sub-Committee to consider these issues and internet regulation. Of course, the Sub-Committee has the same members. It is up to the Select Committee to determine how it structures itself and spends its time, but there is only so much that any one departmental Select Committee can do among its huge range of other responsibilities. It might be worth thinking about a special Committee, drawing on the powers and knowledge of both Houses, but that is not a matter for the Bill. As my hon. Friend knows, it is a matter of amending the Standing Orders of the House, and the House must decide that it wants to create such a Committee. I think it is something we should consider.

We must make sure that encrypted services have proper transparency and accountability, and we must bring in skilled experts. Members have talked about researcher access to the companies’ data and information, and it cannot be a free-for-all; there has to be a process by which a researcher applies to get privileged access to a company’s information. Indeed, as part of responding to Ofcom’s risk registers, a company could say that allowing researchers access is one of the ways it seeks to ensure safe use of its platform, by seeking the help of others to identify harm.

There is nothing to stop Ofcom appointing many researchers. The Bill gives Ofcom the power to delegate its authority and its powers to outside expert researchers to investigate matters on its behalf. In my view, that would be a good thing for Ofcom to do, because it will not have all the expertise in-house. The power to appoint a skilled person to use the powers of Ofcom exists within the Bill, and Ofcom should say that it intends to use that power widely. I would be grateful if the Minister could confirm that Ofcom has that power in the Bill.

--- Later in debate ---
Paul Scully

The hon. Gentleman is talking about the access of coroners, families and others to information, following the sad death of Molly Russell. Again, I pay tribute to Ian Russell and all the campaigners. I am glad that we have been able to find an answer to a very complex situation, not only because of its international nature but because of data protection, et cetera.

The measures I have outlined will ensure that risks relating to security vulnerabilities are managed. The Bill is also clear that Ofcom cannot require companies to use proactive technology on privately communicated content, in order to comply with their safety duties, which will provide further safeguards for user privacy and data security.

Damian Collins

Will the Minister make it clear that we should expect the companies to use proactive technology, because they already use it to make money by recommending content to people, which is a principal reason for the Bill? If they use proactive technology to make money, they should also use it to keep people safe.

Paul Scully

My hon. Friend absolutely nails it. He said earlier that businesses are already collecting this data. Since I was first involved with the Bill, it has primarily been about getting businesses to adhere to their own terms and conditions. The data they use should be used in that way.

The amendment to the definition of “freedom of expression” in part 12 would have no effect as these concepts are already covered by the existing definition. Changing the definition of “automated tool” would introduce untested terms and would have an unclear and confusing impact on the duties.

My hon. Friend the Member for Yeovil also asked for clarification of how Ofcom’s power to view information remotely will be used, and whether the power is sufficiently safeguarded. I assure the House that this power is subject to strict safeguards that mean it cannot be used to undermine a provider’s systems.

On Third Reading in the other place, the Government introduced amendments that defined the regulator’s power to view information remotely, whereas previously the Bill spoke of access. As such, there are no risks to system security, as the power does not enable Ofcom to access the service. Ofcom also has a duty to act proportionately and must abide by its privacy obligations under the Human Rights Act. Ofcom has a stringent restriction on disclosing businesses’ commercially sensitive and other information without consent.

My hon. Friend also asked for clarification on whether Ofcom will be able to view live user data when using this power. Generally, Ofcom would expect to require a service to use a test dataset. However, there may be circumstances where Ofcom asks a service to execute a test using data that it holds, for example, in testing how content moderation systems respond to certain types of content on a service as part of an assessment of the systems and processes. In that scenario, Ofcom may need to use a provider’s own test dataset containing content that has previously violated its own terms of service. However, that would be subject to Ofcom’s privacy obligations and data protection law.

Lords amendment 17 seeks to explicitly exempt low-risk functionality from aspects of user-to-user services’ children’s risk assessment duties. I am happy to reassure my hon. Friend that the current drafting of the Government’s amendment in lieu of Lords amendment 17 places proportionate requirements on providers. It explicitly excludes low-risk functionality from the more stringent duty to identify and assess the impact that higher-risk functionalities have on the level of risk of harm to children. Proportionality is further baked into this duty through Ofcom’s risk assessment guidance. Ofcom is bound by the principle of proportionality as part of its general duties under the Communications Act 2003, as updated by the Bill. As such, it would not be able to recommend that providers should identify and assess low-risk functionality.

The amendment to Lords amendment 217 tabled by my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) would introduce a new safeguard that requires Ofcom to consider whether technology required under a clause 122 notice would circumvent end-to-end encryption. I wish to reassure him and others who have raised the question that the amendment is unnecessary because it is duplicative of existing measures that restrict Ofcom’s use of its powers. Under the Bill’s safeguards, Ofcom cannot require platforms to weaken or remove encryption, and must already consider the risk that specified technology can result in a breach of any statutory provision or the rule of law concerning privacy. We have intentionally designed the Bill so that it is technology neutral and futureproofed, so we cannot accept amendments that risk the legislation quickly becoming out of date. That is why we focused on safeguards that uphold user rights and ensure measures that are proportionate to the specific risks, rather than focusing on specific features such as encryption. For the reasons I have set out, I cannot accept the amendment and hope it will not be pressed to a vote.

The amendment tabled by my hon. Friend the Member for Stroud (Siobhan Baillie) would create an additional reporting requirement on Ofcom to review, as part of its report on the use of age assurance, whether the visibility of a user’s verification status improves the effectiveness of age assurance, but that duplicates existing review requirements in the Bill. The Bill already provides for a review of user verification; under clause 179, the Secretary of State will be required to review the operation of the online safety regulatory framework as a whole. This review must assess how effective the regulatory framework is at minimising the risk of harm that in-scope services pose to users in the UK. That may include a review of the effectiveness of the current user verification and non-verified users duty. I also thank my hon. Friend for raising the issue of user verification and the visibility of verification status. I am pleased to confirm that Ofcom will have the power to set out guidance on user verification status being visible to all users. With regard to online fraud or other illegal activity, mandatory user verification and visibility of verification status is something Ofcom could recommend and require under legal safety duties.

Let me quickly cover some of the other points raised in the debate. I thank my hon. Friend the Member for Gosport (Dame Caroline Dinenage), a former Minister, for all her work. She talked about young people, and the Bill contains many measures—for example, on self-harm or suicide content—that reflect their needs and will help to protect them. On the comments made by the hon. Member for Aberdeen North (Kirsty Blackman) and indeed the shadow Minister, the hon. Member for Pontypridd (Alex Davies-Jones), whom I am glad to see back in her place, there are a number of review points. Clause 179 requires the Secretary of State to review how the Bill is working in practice, and the resulting report will be laid before Parliament. We also have the annual Ofcom report that I talked about, and most statutory instruments in the Bill will be subject to the affirmative procedure. The Bill refers to a review after two to five years—Ministers can decide when it takes place within that period—but that timing is designed to allow long enough for the Bill to bed in and be implemented. It is important that we have the ability to look at that in Parliament. The principles of the UN convention on the rights of the child are already in the Bill; although the Bill does not cite the convention by name, its principles are all covered.

My hon. Friend the Member for Folkestone and Hythe (Damian Collins) did an amazing job in his time in my role, and before and afterwards as Chair of the Joint Committee responsible for the pre-legislative scrutiny of the Online Safety Bill. When he talked about scrutiny, I had the advantage of seeing the wry smile of the officials in the Box behind him. That scrutiny has been going on since 2021. Sarah Connolly, one of our amazing team of officials, has been involved with the Bill since it was just a concept.

Damian Collins

As Carnegie UK Trust observed online, a child born on the day the Government first published their original internet safety strategy would now be in their second year of primary school.

Paul Scully

I do not think I need to respond to that, but it goes to show, does it not?

My hon. Friend talked about post-legislative scrutiny. Now that we have the new Department for Science, Innovation and Technology, we have extra capacity within Committees to look at various aspects, not just online safety, as important as that is. It also gives us the ability to have Sub-Committees. Clearly, we want to make sure that this and all the decisions that we make are scrutinised well. We are always open to looking at what is happening. My hon. Friend talked about Ofcom being able to appoint skilled persons for research—I totally agree, and he absolutely made the right point.

My right hon. Friend the Member for Basingstoke (Dame Maria Miller) and the hon. Member for Caithness, Sutherland and Easter Ross (Jamie Stone) talked about cyber- flashing. As I have said, that has come within the scope of the Bill, but we will also be implementing a broader package of offences that will cover the taking of intimate images without consent. To answer my right hon. Friend’s point, yes, we will still look further at that matter.

The hon. Member for Leeds East (Richard Burgon) talked about Joe Nihill. Will he please send my best wishes and thanks to Catherine and Melanie for their ongoing work in this area? It is always difficult, but it is admirable that people can turn a tragedy into such a positive cause. My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) made two points with which I absolutely agree. They are very much covered in the Bill and in our thinking as well, so I say yes to both.

My right hon. Friend the Member for Chelmsford (Vicky Ford) and my hon. Friend the Member for Penistone and Stocksbridge (Miriam Cates) talked about pornography. Clearly, we must build on the Online Safety Bill. We have the pornography review as well, which explores regulation, legislation and enforcement. We very much want to make sure that this is the first stage, but we will look at pornography and the enforcement around that in a deeper way over the next 12 months.

Oral Answers to Questions

Thursday 27th April 2023
Commons Chamber
Lucy Frazer

As the hon. Member will know, the Commissioner for Public Appointments is looking into this matter, and it would not be appropriate to comment until it has published its full report.

Damian Collins (Folkestone and Hythe) (Con)

The Folkestone library at Grace Hill in the town was one of the early Carnegie libraries, an important cultural hub as well as a working building. It is currently closed because structural repairs are required. Does the Minister agree that the Arts Council strategy should recognise not only the need to support working library facilities, but that they are often important heritage assets that benefit the whole local community?

Lucy Frazer

Library facilities are very important, and I was pleased to visit a library facility recently. We have put more funding into libraries and into communities across the country.

Data Protection and Digital Information (No. 2) Bill

Damian Collins (Folkestone and Hythe) (Con)

I am delighted to speak in support of this long-awaited Bill. It is a necessary piece of legislation to learn the lessons from GDPR and look at how we can improve the system, both to make it easier for businesses to work with and to give users and citizens the certainty they need about how their data will be processed and used.

In bringing forward new measures, the Bill in no way suggests that we are looking to move away from our data adequacy agreements with the European Union. Around the world—in North America, Europe, Australia and elsewhere in the Far East—we see Governments looking at developing trusted systems for sharing and using data and for allowing businesses to process data across international borders, knowing that those systems may not be exactly the same but work to the same standards and with similar levels of integrity. That is clearly the direction in which the whole world wants to move, and we should play a leading role in that.

I want to talk briefly about an important area of the Bill: getting the balance between data rights and data safety and what the Bill refers to as the “legitimate interest” of a particular business. I should also note that this Bill, while important in its own right, sits alongside other legislation—some of it to be introduced in this Session and some of it already well on its way through the Parliamentary processes—dealing with other aspects of the digital world. The regulation of data is an aspect of digital regulation; it is in some ways the fuel that powers the digital experience and is relevant to other areas of digital life as well.

To take one example, we have already established and implemented the age-appropriate design code for children, which principally addresses the way data is gathered from children online and used to design services and products that they use. As this Bill goes through its parliamentary stages, it is important that we understand how the age-appropriate design code is applied as part of the new data regime, and that the safeguards set out in that code are guaranteed through the Bill as well.

There has been a lot of debate, as has already been mentioned, about companies such as TikTok. There is a concern that engineers who work for TikTok in China, some of whom may be members of the Chinese Communist party, have access to UK user data that may not be stored in China, but is accessed from China, and are using that data to develop products. There is legitimate concern about oversight of that process and what that data might be used for, particularly in a country such as China.

However, there is also a question about data, because one reason the TikTok app is being withdrawn from Government devices around the world is that it is incredibly data-acquisitive. It does not just analyse how people use TikTok and from that create data profiles of users to determine what content to recommend to them, although that is a fundamental part of the experience of using it; it is also gathering, as other big apps do, data from what people do on other apps on the same device. People may not realise that they have given consent, and it is certainly not informed consent, for companies such as TikTok to access data about what they do on other apps, not just when they are using TikTok.

It is a question of having trusted systems for how data can be gathered, and giving users the right to opt out of such data systems more easily. Some users might say, “I’m quite happy for TikTok or Meta to have that data gathered about what I do across a range of services.” Others may say, “No, I only want them to see data about what I do when I am using their particular service, not other people’s.”

The Online Safety Bill is one of the principal ways in which we are seeking to regulate AI now. There is debate among people in the tech sector; a letter was published recently, co-signed by a number of tech executives, including Elon Musk, calling for a six-month pause in the development of AI systems, particularly large language models. That suggests a problem in the near future of very sophisticated data systems that can make decisions faster than a human can analyse them.

People such as Eric Schmidt have raised concerns about AI in defence systems, where an aggressive system could make decisions faster than a human could respond to them, to which we would need an AI system to respond and where there is potentially no human oversight. That is a frightening scenario in which we might want to consider moratoriums and agreements, as we have in other areas of warfare such as the use of chemical weapons, that we will not allow such systems to be developed because they are so difficult to control.

If we look at the application of that sort of technology closer to home and some of the cases most referenced in the Online Safety Bill, for example the tragic death of the teenager Molly Russell, we see that what was driving the behaviour of concern was data gathered about a user to make recommendations to that person that were endangering their life. The Online Safety Bill seeks to regulate that practice by creating codes and responsibilities for businesses, but that behaviour is only possible because of the collection of data and decisions made by the company on how the data is processed.

This is where the Bill also links to the Government’s White Paper on AI, and this is particularly important: there must be an onus on companies to demonstrate that their systems are safe. The onus must not just be on the user to demonstrate that they have somehow suffered as a consequence of a system’s design. Companies should have to demonstrate that they are designing systems with people’s safety and their rights in mind—be that their rights as workers and citizens, or their rights to certain safeguards and protections over how their data is used.

Companies creating datasets should be able to demonstrate to the regulator what data they have gathered, how models are being trained on that data and what it is being used for. It should be easy for the regulator to see and, if the regulator has concerns up front, it should be able to raise them with the company. We must try to create that shift, particularly for AI systems, in how systems are tested before they are deployed, with both safety and the principles set out in the legislation in mind.

Kit Malthouse

My hon. Friend makes a strong point about safety being designed, but a secondary area of concern for many people is discrimination—that is, the more data companies acquire, the greater their ability to discriminate. For example, in an insurance context, we allow companies to discriminate on the basis of experience or behaviour; if someone has had a lot of crashes or speeding fines, we allow discrimination. However, for companies that process large amounts of data and may be making automated decisions or otherwise, there is no openly advertised line of acceptability drawn. In the future it may be that datasets come together that allow extreme levels of discrimination. For example, if they linked data science, psychometrics and genetic data, there is the possibility for significant levels of discrimination in society. Does he think that, as well as safety, we should be emphasising that line in the sand?

--- Later in debate ---
Damian Collins

My right hon. Friend makes an extremely important point. In some ways, we have already seen evidence of that at work: there was a much-talked-about case where Amazon was using an AI system to aid its recruitment for particular roles. The system noticed that men tended to be hired for that role and therefore largely discarded applications from women, because that was what the data had trained it to do. That was clear discrimination.

There are very big companies that have access to a very large amount of data across a series of different platforms. What sort of decisions or presumptions can they make about people based on that data? On insurance, for example, we would want safeguards in place, and I think that users would want to know that safeguards are in place. What does data analysis of the way in which someone plays a game such as Fortnite—where the company is taking data all the time to create new stimuli and prompts to encourage lengthy play and the spending of money on the game—tell us about someone’s attitude towards risk? Someone who is a risk taker might be a bad risk in the eyes of an insurance company. Someone who plays a video game such as Fortnite a lot and sees their insurance premiums affected as a consequence would think, I am sure, that that is a breach of their data rights and something to which they have not given any informed consent. But who has the right to check? It is very difficult for the user to see. That is why I think the system has to be based on the idea that the onus must rest on the companies to demonstrate that what they are doing is ethical and within the law and the established guidelines, and that it is not for individual users always to demonstrate that they have somehow suffered, go through the onerous process of proving how that has been done, and then seek redress at the end. There has to be more up-front responsibility as well.

Finally, competition is also relevant. We need to safeguard against the idea of a walled garden for data meaning that companies that already have massive amounts of data, such as Google, Amazon and Meta, can hang on to what they have, while other companies find it difficult to build up meaningful datasets and working sets. When I was Chairman of the then Digital, Culture, Media and Sport Committee, we considered the way in which Facebook, as it then was, kicked Vine—a short-form video sharing app—off its platform principally because it thought that that app was collecting too much Facebook user data and was a threat to the company. Facebook decided to deny that particular business access to the Facebook platform. [Interruption.] I see that the Under-Secretary of State for Science, Innovation and Technology, my hon. Friend the Member for Sutton and Cheam (Paul Scully), is nodding in an approving way. I hope that he is saying silently that that is exactly what the Bill will address to ensure that we do not allow companies with big strategic market status to abuse their market power to the detriment of competitive businesses.

Budget Resolutions and Economic Situation

Damian Collins Excerpts
Monday 20th March 2023


Commons Chamber
Damian Collins (Folkestone and Hythe) (Con)

It is, as always, a pleasure to follow the shadow Secretary of State, although I could not help noticing that she seemed to spend longer making general political points about the Budget than actually addressing issues relating to the Department for Science, Innovation and Technology. [Interruption.] The hon. Lady says “So?” from a sedentary position, and she is perfectly entitled to do so—the Budget debate is a general debate during which Members can bring up topics relating to any subject, not just the one that is slated for the day—but I mention it because I think this is an area in which the Government have an incredibly strong record over many years.

That is demonstrated by the fact that investors and businesses recognise the UK as a global hub—a leading centre in Europe and in the world. When we talk to tech investors working in hubs in Berlin or Barcelona or Tel Aviv, we hear that they regard London as the primary centre where they go to raise funds to grow and scale their businesses. As the Secretary of State said, we have the leading research institutions in the world: four of the world’s top 10 universities are based here. Our university clusters are driving innovation and growth in the sector, which is why we are so well regarded and respected. Our strategy for making the UK a world centre for tech and innovation is based on three key areas: driving growth in the economy, having a pro-competition strategy, and setting high standards.

When it comes to growth in the tech sector, we should look at investment not just in London and the south-east but across the UK, and at the way in which tech sectors have emerged and developed over the last decade. A good demonstration of that, as the shadow Secretary of State knows, is that we can jump on the Metro or the tram in Manchester and see the emergence of Salford Quays as one of Europe’s leading centres for creative industries and technology. When I was Chairman of the Select Committee, we visited Dundee. Dundee and Edinburgh are leading centres for the video games industry.

In Birmingham, within a stone’s throw of where the Birmingham hub for high-speed rail will be, we can see institutions such as Birmingham City University with its fantastic STEAMhouse centre for tech skills, where AI training courses are being delivered. That is over the road from the Greater Birmingham and Solihull Institute of Technology, a centre for advanced engineering, which is down the road from Fazeley Studios, which has become an important hub for the broadcasting and creative industries in Birmingham. Many of the buildings on those sites did not exist a decade ago, and the idea that this would be a major cluster for the tech sector and the broadcasting and creative industries was not something that people would have envisaged in 2010, but it is a reality today as a consequence of policies that have been put in place by this Government. That is why the Chancellor was right to recognise in the Budget the strategic significance of investment in research and development, and also in the strategic hubs and clusters of businesses that are so important for driving the sector.

The UK will be a leader in digital competition, and that is one of the reasons we need to support British businesses throughout their life cycle: not just in the R&D phase when they are growing, but when they seek to scale as well. If emerging businesses are to scale in tech marketplaces, we need to ensure that they can compete fairly alongside the tech giants whose services they rely on to reach their customers. Many app developers cannot reach their customers without using products and platforms designed by Meta. Most businesses require Amazon services either for cloud storage or for selling. Most businesses also require a good ranking on Google to reach their customers. They should be able to do so fairly. There are only two significant app stores: Google’s and Apple’s. Those two monopolies exist alongside each other. Any developer needs to use those stores to reach customers, just as any customer needs to use them to access the products they want. It is important that customers and businesses are treated fairly, and the Digital Markets, Competition and Consumers Bill, which will come forward soon, will be vital to securing that.

Standards are one of the most important aspects of the UK’s leadership. One of the best examples of standards, certainly for AI, is the Online Safety Bill. It is world-leading legislation that will effectively cover the regulation of the AI-driven recommendation tools that shape the experience of social media. AI is an enabling technology. It draws on data to make recommendations and decisions on behalf of users to improve the user experience. However, like any other form of technology, it requires the right standards and safety regulations around it to ensure that it delivers safely. New chat tools have been mentioned. AI-driven chatbots are new as a technology, but the principles behind them are not. We have seen the same principles at work in technologies such as autocomplete on Google, where online tools guess and make assumptions about what people want to see or what responses they want. We need to ensure that such tools make sensible, reasonable recommendations and do not direct people towards harmful content or hate speech, or drive people into isolated groups and communities.

There need to be standards that underpin the way that AI works and the recommendations it makes to users who engage with those tools. That is why the Government were right to recommend and support the creation of a UK AI sandbox, where companies can trial, and demonstrate trials of, new technologies before they are rolled out. This is common and standard in most other industries. The European Union is developing an AI sandbox, and it is right that we should do the same here. It is also right that we should build on the work of AI standards being developed at the Alan Turing Institute, to set a standards-based framework for the applications of AI in the future.

Science and Technology Framework

Damian Collins Excerpts
Tuesday 7th March 2023


Commons Chamber
Michelle Donelan

Indeed. When we talk about our science and technology agenda, this is not just to support big tech; it is to support all businesses, including small and medium-sized ones, which we hope to be able to support to scale up and continue to grow and create jobs. At the heart of our plans, the hon. Gentleman will see how we can support them in a range of different ways through the 10-point plan and by being strategic across Government, from our approach on skills to our approach on regulation. And let us not forget that this Department is coming forward with a number of pieces of legislation, including the Data Protection and Digital Information Bill, which will help to remove some unnecessary regulatory burden from businesses, and the digital markets Bill, which is focused on freeing up some of those small businesses and unlocking opportunities for growth.

Damian Collins (Folkestone and Hythe) (Con)

I welcome the framework and also my right hon. Friend’s commitment that the Government will soon be publishing the national semiconductor strategy. Does she agree that this is a fantastic opportunity to highlight not just the leading role in the world that British companies play in semiconductor design, but the attractiveness of the UK for investment in advanced manufacturing, particularly in compound semiconductors?