
Companies in the UK are using people’s personal data to target them with billboard ads

That’s according to a new investigation by Big Brother Watch, a London-based civil liberties group known for its work on issues of public surveillance.

The report details how personalized ads – a phenomenon that has repeatedly raised privacy concerns over digital surveillance – are no longer confined to our private feeds, but have begun to spill over into our public lives.

“We’ve discovered new ways to track the movements and behaviors of millions of people to target us with ads on the streets, resulting in some of the most intrusive ad surveillance we’ve ever seen in the UK,” Jake Hurfurt, head of research and investigations at Big Brother Watch, said in a press release about the analysis.

The report identifies several companies that pioneered the introduction of face detection technology in different cities across the country. Unlike traditional billboards, whose advertisements are printed on paper or vinyl, digital billboards can be programmed to deliver more than one message. Many of them also carry high-definition cameras that observe the unsuspecting public. Algorithms then attempt to detect a person’s face, physical features, and even what they might be wearing in order to tailor ads to people walking down the street, in malls, and even on tablets in the backs of cars.

ALFI, an American ad-tech developer, already has many of these facial recognition tablets in various Lyft and Uber vehicles in the United States. The company claims to use artificial intelligence and machine learning algorithms to analyze how their audience interacts with ads and show them more relevant ads. Now, more than ever, everything is a camera and every camera is a computer.

The report also notes that two influential UK billboard owners, Ocean Outdoor and Clear Channel, rely on face detection tools made by the French company Quividi. The company says its technology can scan up to 100 faces at once and detect how long a person has been standing nearby or paying attention to an advertisement. It also attempts to discern factors such as age, gender, and mood – abilities that have been heavily contested and debunked by machine learning experts.

The report notes that this data, combined with information about crowd size and attention, can be used to trigger changes to the ads being displayed, targeting audiences at scale.

While it’s one thing to recognize that predictive analytics can control what we see and interact with from the comfort of our own homes, it’s another to realize that you and the people around you are being collectively influenced. Arvind Narayanan, professor of computer science at Princeton University, says one of the main problems with companies using data-collecting technologies to personalize billboards is that they “erode the idea of public spaces.”

“It’s hard to have spontaneous, informal social interactions with strangers when you’re watching content that’s aimed at you and you know you’re being watched,” Narayanan told Motherboard via email. “These technologies achieve the feat of simultaneously damaging our privacy and our sense of community.”

“Quividi software relies on face detection, not facial recognition,” a Quividi spokesperson told Motherboard. “These are two different technologies. Face detection only looks for the presence of a face while face recognition looks for and identifies a specific person.”

“This means that the Quividi software cannot recognize an individual, neither in absolute terms (complete identity) nor in terms of repeated exposures (e.g. recognizing that someone was at a sequence of different places or visited the same place twice),” the spokesperson said.

Targeted advertising is virtually inevitable for anyone with a smartphone or computer, but some experts say the pressure to use our privacy against us didn’t start with the advent of the internet or AI; in fact, the concept has been closely linked to capitalism for more than a century.

“The whole point of surveillance advertising or digital advertising is to change our behaviors in certain ways or change our attitudes in certain ways,” Matthew Crain, associate professor of media and communication at the University of Miami, told Motherboard. As presumptuous as it sounds, Crain says, the more information a brand has about its potential audience, the less money it wastes sending ads to people or groups outside of its target market.

The way companies access our data is both sinister and surprisingly mundane: the report notes that companies use data-tracking apps and vaguely worded privacy policies to obtain users’ “consent” to collect large amounts of data and build advertising profiles. These individual profiles can cover everything from how users interact with their apps to the stores they frequent most, and this covert fusion of our likes and dislikes is then sold to data analytics companies to use indefinitely.

The survey also revealed that the profiles of certain interest groups are linked to GPS tracking data, allowing brands to target people based on where and when they are likely to be on a given day, creating near real-time advertisements. The report specifically calls out Adsquare, a German ad-tech company that has “pioneered” this phone-to-billboard strategy, noting that 1 in 10 mobile devices in the UK carries trackers that send personal data back to it. This means there are at least 8 million phones that could be sending location and behavior data to Adsquare at any given time.

But these frightening and effective advances are not limited to the UK; evidence of the practice has already been seen in the United States and elsewhere in the world. For example, although Adsquare claims to comply with privacy laws regarding the use of these tracking tools, one of its data brokers is the controversial company X-Mode, now known as Outlogic, which was banned from Apple’s and Google’s app stores in 2020 for selling data to the US military.

Hurfurt said the only way to force data collectors to respect people’s privacy and give them real choices is through sweeping and transparent tech sector reforms.

Steven Feldstein, a senior fellow at the Carnegie Endowment for International Peace, agrees with that sentiment. “As far as oversight goes, there’s been quite a significant regulatory backlog when it comes to getting the right rules and laws in place to regulate these industries,” Feldstein told Motherboard. “There is a real gap in terms of regulation to catch up with the practice and ensure that the privacy needs of individuals are protected.”

Similar examples of companies abusing advertising data have since inspired public policy actors in the United States to speak out and promote legislation that would prohibit ad networks from using personal data, as well as data based on protected-class information such as race, gender, and religion, to target advertisements.

“It’s not that ad tracking has no place in the digital ecosystem, but that right now it’s so unbalanced in one direction,” Feldstein says. “There’s so little accountability, there’s so little transparency about how it’s being used, and so little consent protection, that it’s really out of whack, and I think it’s causing disturbing harm as a result.”

This article is part of State of Surveillance, made possible by a grant from Columbia University’s Ira A. Lipman Center for Journalism and Civil and Human Rights in conjunction with Arnold Ventures. The series will explore the development, deployment and effects of surveillance and its intersection with race and civil rights.