UK Advertising in a Digital Age (Communications Committee Report) Debate

Baroness Kidron Excerpts
Thursday 25th April 2019

Lords Chamber
Baroness Kidron (CB)

The noble Lord, Lord Gilbert, has set out very admirably the findings of our report, particularly our concerns about the opaque nature of advertising and the lack of diversity in the industry. There is also a palpable concern that current education policy does not meet the needs of the creative industries and that the Department for Education is perhaps not in listening mode; and that the looming spectre of Brexit threatens London’s status as the premier hub, and first-choice location, of such a lucrative industry.

Rather than go over the ground that has been so admirably covered, I would like to pick up on one key issue that I believe would have been in the report if it were being published now and is clearly on the minds of other noble Lords; it became a considerable concern as we went on to our second report, Regulating the Internet. I refer to the monetisation, commercialisation and commodification of children online. In doing so, I refer noble Lords to my interests as listed in the register.

Children make up one-third of all online users globally and are therefore, like all users, subject to the business models of the world’s most popular online platforms, much of whose value comes from the commoditisation of data. That business model is to harvest from users as much personal data as possible, then use that data to encourage them into behaviours and decisions likely to generate profit—that is, to advertise, market or otherwise make the user available to those who wish to have their attention. As a result, children are bombarded with targeted advertising, irrespective of whether it is in their best interests. The platforms they use are designed to keep them online for as long as possible, even to the point of addiction. This is why the majority of companies that provide online services are incentivised not to care if their users are underage. If a user creates data, they create value; if they create value, then they are old enough.

Since this goes far beyond what we traditionally understand as advertising, it is perhaps useful to consider how it plays out in practice. I would guess, although I cannot be certain, that not many of our small number have played Pokémon GO, a game that takes players out of the house to locate and collect virtual creatures in real-world places. But the chances are that even those who have played it—children included—do not know that the game’s real prize is not the collection of virtual creatures but, rather, the sale of the user’s location data to companies willing to pay.

The commercial arrangement between Pokémon GO maker Niantic and McDonald’s is the most prominent example of this. McDonald’s pays Niantic to place virtual Pokémon in its car parks and restaurants, thereby directing droves of oblivious children towards Big Macs, fries and chicken nuggets just as the game is finished. If this were an outlier, it would still be an affront, but targeting children is a growing norm.

In 2017, a leaked Facebook memo provoked outrage when it revealed that the company had given a presentation to advertisers demonstrating its ability to infer emotional states, in real time, from the posts and photos of millions of children, determining when they were feeling “stressed”, “nervous”, “overwhelmed”, “anxious” or “useless”. In other words, it was targeting children with advertising when they were at their most vulnerable.

This sort of profiling and targeting is a new frontier—not advertising as we once understood it, but using a child’s emotional state to help predict and shape their behaviour and then nudging them at the point they are most likely to respond. In more straightforward language, it is making them available to advertisers and marketeers at the precise moment that they are most vulnerable to the push of that commercial interest. This is not a fair fight.

Even if children are feeling their best, they are still vulnerable. Research on children’s cognitive development vis-à-vis advertising shows again and again that they are unable to spot native advertising—that is, advertising that adopts the look, feel and function of the media format in which it appears; it is designed to be indistinguishable from, and therefore to undermine, other content such as facts or news. No wonder Ofcom finds that only a fifth of eight to 11 year-olds and a third of 12 to 15 year-olds can differentiate between promotional and factual content, understand that prominent search results have probably been paid for, or identify and resist the nudge towards in-app purchase. The committee’s report correctly identifies that,

“many businesses exploit users’ data without informed consent”.

We must surely also ask whether it is appropriate to seek the consent of a child to treat them in any of these ways. Profiling, manipulating and targeting children is wrong in principle and harmful in practice.

The age-appropriate design code, launched in draft last week by the Information Commissioner, offers a new approach. It states that a child’s data must be processed only in circumstances where they are actively and knowingly engaged, and for purposes in their own best interests. This children’s code, as it is now nicknamed, will require online services—including the advertising sector—to reconsider how they treat children online by making them observe the norms and protections of childhood, including protecting children from economic pressure and exploitation. The code’s 16 provisions cover a number of interconnected aspects of data protection, such as high-privacy default settings; preventing sites from recommending material detrimental to children’s health and well-being; and strategies to minimise the gathering of data—since the very best way of avoiding abuse of a child’s data is not to take it in the first place. The code also covers data sharing, the security of connected toys and the promotion of commercial activities that fail the bar of being in the best interests of the child. Taken together, these provisions effectively take children out of the excesses of the business model.

Since its publication on 15 April, I have been asked repeatedly if I think it is reasonable to demand companies reduce their potential profit by preventing the commercial exploitation of children’s data. I do not understand the question. It is not desirable, safe or in line with the norms of our society to suggest that a 13 year-old—or the many millions of even younger children who access commercial sites—be asked to manage the complexities and intrusive demands of a world dominated by the interests of online advertising behemoths, not least when our report quotes expert after expert describing the digital advertising market as “highly opaque”, “murky” and “fraudulent”. Under these circumstances, one must ask what chance children or young people have to protect themselves, and come to the obvious conclusion that they have none.

There is nothing intrinsically wrong with the technology we are all using. On the contrary, within it lies the promise of a better and more equitable world. However, a greedy corporate culture has been allowed to develop and until now the sector has been given a free pass for the collateral damage of its model, including the monetisation, commercialisation and commodification of childhood. Rather than questioning whether businesses should protect their bottom line, we must reassert that protecting children should be everyone’s bottom line.

So does the Minister agree with me that innovation that does not include protecting the well-being of children is not worthy of the name, and that businesses in the sector, big and small, must put the best interests of children first when designing their products and services? Can he also confirm that the Government will stand firm behind the Information Commissioner, whose children’s code is much admired around the world as the first serious attempt to tackle the asymmetry of power between the tech sector and children, and resist the attempts of the commercial interests working furiously in the background to water it down?

Finally, if advertising now includes the ability to take a child out of their bedroom, out of their home and across town to a McDonald’s car park without their knowledge, understanding or informed consent, does the Minister agree with me that it is now time for society to formally uphold all the privileges, protections and legal frameworks that have defined childhood so far, irrespective of the nature of the service, who is paying for access to that child or where the owner is registered?

--- Later in debate ---
Lord Griffiths of Burry Port (Lab)

My Lords, I must congratulate the noble Lord, Lord Bilimoria, on doing a bit of personal advertising for his product—but not online, of course.

I am most grateful to the noble Lord, Lord Gilbert, for presenting this report at last. A year is a long time and there have been multiple postponements when other activity, which seems to me more questionable, has dominated the procedures and business of this House. This debate comes at what I consider to be a difficult time: it has been a year, and much has happened to supersede the report. Having only recently heard the business of the House for the coming two weeks, I note that this debate also comes just a few days before we discuss the online harms report, where much of this material will be rehearsed all over again. We must try to do honour to this report and thank those on the committee who produced it, noting of course that between then and now—it was 11 April last year when the report appeared—many things have happened. I will cherry-pick from a bunch of bullet points that I have pulled together on things that have happened in the meantime.

First, the report looked forward to GDPR; that has now happened and we can evaluate it as we will. My noble friend Lord Gordon recommended a doorstopper book; I have something a little thinner than that, published last September. It is Martin Moore’s Democracy Hacked, which ought to be recommended reading for anybody taking part in these debates. I will willingly contribute to a fund to make it available for members of the committee and others who are interested. It is a systematic, scientific look at all the questions that have been raised in various parts of the House.

Then we got the Plum report, if I may call it that, which is included in the admirable briefing package that came from the Library. That report was commissioned by the DCMS and is called Online Advertising in the UK. It updates and extrapolates information, and presents it beyond the scope of the original report. The noble Lord, Lord Bilimoria, has just referred to the Cambridge Analytica scandal, which I presume came too late for inclusion in the report—in detail, anyway—and then there was the online harms White Paper itself. The noble Baroness, Lady Kidron, talked about the age-verification material that is becoming available, although I hope she can reassure us regarding a report in a newspaper that said:

“Age-related curbs on porn circumvented in minutes”—

Baroness Kidron

The code that I referred to is the age-appropriate design code. That is about data protection and separate from the age verification of pornography, just for the record.

Lord Griffiths of Burry Port

Thank you. In this area, I need correcting quite a lot, so I welcome that intervention. For all that, the concern for children is picked up in the online harms White Paper: certain harms are stipulated in the list there specifically where they affect children.

I cannot forbear mentioning this morning’s news about Facebook. If it can put £5 billion aside to pay for the infractions which have occurred in its activities, and if for all that £5 billion it can report a 26% rise in its profits, we simply have to ask: are we in waters that are too deep for us to swim in? There are contradictory elements happening that I find very threatening and bewildering.

I want to go back for a moment. I have compared the list of harms on page 31 of the online harms White Paper—I will do it more systematically before the debate next week on that paper—with the harms, hinted at by the noble Lord, Lord Bilimoria, listed in the Plum report: individual, societal and economic. There are so many harms identified in the Plum report that do not figure at all in the list in the DCMS online harms report. We had a Question for Short Debate on this when the paper was published and I was on a wave of euphoria, because after all that Brexit stuff we were talking about real things again. I really was flying, but afterwards the noble Baroness, Lady Neville-Rolfe, said to me, “But there’s nothing about online harms for small and medium enterprises”. Then the noble Baroness, Lady O’Neill, came to me and said, “But there’s nothing in there about the harms for our democracy”. In the end, the paper has to be more generic and more comprehensive across the spectrum than it currently is.

Let us look at the harms in the Plum report. There is,

“Digital advertising fraud … brand risk”,

and,

“Inappropriate advertising … that is … offensive, explicit … or … contains malware”.

Under “Societal harms” there is,

“financial support for publishers of offensive or harmful content”.

There is also “Discrimination”, described as the use of targeted data inadvertently to categorise people by gender, ethnicity and race.

There is a moment of confession coming up—wait for it. Every morning, I generally address the quick crossword from the Guardian newspaper, and if I do it very quickly I allow myself just a couple of exercises in solitaire. That really is a confession; I am trying to avoid addiction, and coming off it cold turkey is very difficult. But when I put those things on the screen, along with the crossword, a very expensive car is advertised to me. I am a retired Methodist minister and when I came to this House, I came in my rusty Ford Fiesta. On what grounds of behavioural knowledge and profiling of me are they targeting me with a Bentley? When I come to solitaire, however, what do I find? It is ladies’ clothing. What in my life do they know that I would not want to share with Members of this House, for goodness’ sake? Is that clothing for my wife or some other woman, or for myself, if they think that I am interested in these garments? More alarming still, I have repeatedly pressed all the buttons that eliminate the advert from the screen, but no algorithm has yet picked up the fact that I am not interested in the advertising. If it is behaviourally driven it should have done, but it has not; I still get the stuff anyway.

When Martin Moore wrote his book, he took us through all the stages that have produced this side of the internet. We must agree with the right reverend Prelate the Bishop of Durham and others who laud the democratic and communautaire aspects of the internet—what it makes possible for us. At the same time, I fear that the negative aspects—the underbelly or darknet—are becoming disproportionately controlling of the general aspects of this technology. One review said that, just before Facebook went to the stock market in 2012—having started by saying that it did not want any advertising when it first launched in 2004—it went, according to Martin Moore,

“‘all out to create an intelligent, scalable, global, targeted advertising machine’ that gave advertisers granular access to users. And so it created the most efficient delivery system for targeted political propaganda the world had ever seen”.

I will read one final paragraph from the review of this remarkable book, because it points me to both what the internet can do and what is too often implicit in the very things it does well. If I read it, your Lordships will get the tale. It says:

“Actually, Google is already doing a very good job”,

at helping in the field of education. It continues that:

“By mid-2017, the majority of schoolchildren in America were using Google’s education apps, which of course track the activity of every child, creating a store of data that—who knows?—might come in useful when those children grow up to be attractive targets for advertising. In the near future, Moore points out, we might no longer have a choice: ‘It will be a brave parent who chooses to opt out of a data-driven system, if by opting out it means their child has less chance of gaining entry to the college of their choice, or of entering the career’”,

they aspire to. You are in because it is good for you, but being in makes you vulnerable to exploitation in the fullness of time. This is a matter about which we all must be concerned, because it affects us in incalculable ways. Even a Minister of Government as expert as the one facing me now will have to bow to the inevitable, as we stiffen our resolve to face this question head on and do something about it.