A campaign to make online advertising work better and safer.
We have asked regulators to investigate the flaws at the heart of the online “behavioural” advertising auction system, which the IAB and Google play key roles in orchestrating.
Every time a person visits a website and is shown a “behavioural” ad, intimate personal data describing the visitor, and what they are watching online, are broadcast to tens or hundreds of companies. Advertising technology companies broadcast these data widely in order to solicit bids from potential advertisers for the attention of the specific individual visiting the website.
This broadcast, known as a “bid request” in the online industry, fails to protect these intimate data against unauthorised access. That would not be a problem if bid requests contained no personal data. Unfortunately, the IAB “strongly recommends” that they do. Indeed, bid requests can contain highly sensitive personal data.
Under the GDPR this is unlawful. Article 5(1)(f) of the GDPR requires that personal data be “processed in a manner that ensures appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss.” If you cannot protect data in this way, then the GDPR says you cannot process the data.
Hundreds of billions of bid requests are broadcast every day. This means that personal data about the average person on the Web are almost certainly broadcast, regularly, to companies that can examine what he or she reads, watches, and listens to online.
Harms of the status quo
We are all exposed, by hundreds of billions of bid requests a day, to companies, governments, and cold callers.1 They know what we read, watch, and listen to online. We cannot know where our personal data end up once they go out in a bid request.
The publishers we rely on for news and entertainment suffer. Their audiences are commodified and resold by the bottom tier of the web.
Advertisers, on whom we rely to subsidise publishers, are defrauded by business partners in an opaque system, and by criminals running ad-fraud bots that thrive on the use of personal data in the ad auction system. Advertisers are also exposed to unwanted legal hazard.
In short, the broadcast of personal data in real-time bidding hurts worthy publishers, enables a business model for untrustworthy sites, allows the profiling of every single person online, steals marketers’ money, and exposes them to risk.
This can be easily fixed, if we demand it. RTB needs reform.
We want to make online advertising work better and safer. There is a technical way to make RTB operate safely, which we suggest: remove or truncate the personal data, especially sensitive or highly identifying personal data, that the IAB and Google standards prescribe, or even “strongly recommend”, for inclusion in bid requests. This is an essential and practical remedy. If a system is insecure at its core, regulators need to understand and assess how that core could be changed to make it compliant, rather than trying to add polish to a deeply flawed system.
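To illustrate what removing or truncating personal data could look like in practice, here is a minimal sketch in Python. It is not the complainants’ proposal verbatim: the field names (`user.id`, `user.buyeruid`, `device.ifa`, `device.geo`, and so on) are drawn from the IAB’s OpenRTB specification, but the particular sanitisation policy shown, which fields to drop and how to truncate an IP address, is an illustrative assumption.

```python
import copy

# Illustrative list of OpenRTB fields that directly or indirectly
# identify the visitor; a real policy would be set by regulators
# and standards bodies, not hard-coded like this.
PERSONAL_FIELDS = {
    "user": ["id", "buyeruid", "yob", "gender", "geo"],
    "device": ["ifa", "didsha1", "didmd5", "macsha1", "macmd5", "geo"],
}

def truncate_ip(ip):
    """Zero the last octet of an IPv4 address: 203.0.113.42 -> 203.0.113.0."""
    parts = ip.split(".")
    return ".".join(parts[:3] + ["0"]) if len(parts) == 4 else ""

def sanitise(bid_request):
    """Return a copy of a bid request with identifying fields removed
    and the device IP address truncated."""
    req = copy.deepcopy(bid_request)
    for obj, fields in PERSONAL_FIELDS.items():
        for field in fields:
            req.get(obj, {}).pop(field, None)
    if "ip" in req.get("device", {}):
        req["device"]["ip"] = truncate_ip(req["device"]["ip"])
    return req
```

An exchange applying `sanitise()` before broadcasting would still transmit the contextual information an auction needs (the ad slot, the page category), while the fields that single out the individual are gone.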
Who is involved
The complaints are being made by Katarzyna Szymielewicz, President of the Panoptykon Foundation, Jim Killock, Executive Director of the Open Rights Group, Michael Veale of University College London, and Dr Johnny Ryan of Brave, the private web browser. The complainants in Ireland and in the UK have instructed Ravi Naik, Partner at ITN Solicitors.
We are interested in hearing from privacy and consumer protection organizations in other EU Member States that are interested in collaborating to fix adtech. Please do contact us.
1. Data Brokers: A Call for Transparency and Accountability, Federal Trade Commission, May 2014 (https://www.ftc.gov/system/files/documents/reports/data-brokers-call-transparency-accountability-report-federal-trade-commission-may-2014/140527databrokerreport.pdf), pp 39–40.