Dating and fertility apps among those snitching to “out of control” adtech, report finds

The latest report to warn that surveillance capitalism is out of control — and ‘free’ digital services can in fact be very costly to people’s privacy and rights — comes courtesy of the Norwegian Consumer Council, which has published an analysis of how popular apps are sharing user data with the behavioral ad industry.

It suggests smartphone users have little hope of escaping adtech’s pervasive profiling machinery — short of not using a smartphone at all.

A majority of the apps that were tested for the report were found to transmit data to “unexpected third parties” — with users not being clearly informed about who was getting their information and what they were doing with it. Most of the apps also did not provide any meaningful options or on-board settings for users to prevent or reduce the sharing of data with third parties.

“The evidence keeps mounting against the commercial surveillance systems at the heart of online advertising,” the Council writes, dubbing the current situation “completely out of control, harming consumers, societies, and businesses”, and calling for curbs to prevalent practices in which app users’ personal data is broadcast and spread “with few restraints”. 

“The multitude of violations of fundamental rights are happening at a rate of billions of times per second, all in the name of profiling and targeting advertising. It is time for a serious debate about whether the surveillance-driven advertising systems that have taken over the internet, and which are economic drivers of misinformation online, is a fair trade-off for the possibility of showing slightly more relevant ads.

“The comprehensive digital surveillance happening across the adtech industry may lead to harm to both individuals, to trust in the digital economy, and to democratic institutions,” it also warns.

The report documents app users’ data being shared with tech giants such as Facebook, Google and Twitter — which operate their own mobile ad platforms and/or other key infrastructure related to the collection and sharing of smartphone users’ data for ad targeting purposes — as well as with scores of other faceless entities that the average consumer is unlikely to have heard of.

The Council commissioned a data flow analysis of ten popular apps running on Google’s Android smartphone platform — generating a snapshot of the privacy black hole that mobile users inexorably tumble into when they try to go about their digital business, despite the existence (in Europe) of a legal framework that’s supposed to protect people by giving citizens a swathe of rights over their personal data.

Among the findings are a make-up filter app sharing the precise GPS coordinates of its users; ovulation-, period- and mood-tracking apps sharing users’ intimate personal data with Facebook and Google (among others); dating apps exchanging user data with each other, and also sharing with third parties sensitive user info like individuals’ sexual preferences (and real-time device-specific tells such as sensor data from the gyroscope…); and a games app for young children that was found to contain 25 embedded SDKs and which shared the Android Advertising ID of a test device with eight third parties.

The ten apps whose data flows were analyzed for the report are the dating apps Grindr, Happn, OkCupid, and Tinder; fertility/period tracker apps Clue and MyDays; makeup app Perfect365; religious app Muslim: Qibla Finder; children’s app My Talking Tom 2; and the keyboard app Wave Keyboard.

“Altogether, Mnemonic [the company which the Council commissioned to conduct the technical analysis] observed data transmissions from the apps to 216 different domains belonging to a large number of companies. Based on their analysis of the apps and data transmissions, they have identified at least 135 companies related to advertising. One app, Perfect365, was observed communicating with at least 72 different such companies,” the report notes.

“Because of the scope of tests, size of the third parties that were observed receiving data, and popularity of the apps, we regard the findings from these tests to be representative of widespread practices in the adtech industry,” it adds.

Aside from the usual suspect (ad)tech giants, less well-known entities seen receiving user data include location data brokers Fysical, Fluxloop, Placer, Places/Foursquare, Safegraph and Unacast; behavioral ad targeting players like Receptiv/Verve, Neura, Braze and LeanPlum; mobile app marketing analytics firms like AppsFlyer; and ad platforms and exchanges like AdColony, AT&T’s AppNexus, Bucksense, OpenX, PubNative, Smaato and Vungle.

In the report the Forbrukerrådet concludes that the pervasive tracking of smartphone users which underpins the behavioral ad industry is all but impossible to escape — even for users who are able to locate an on-device setting to opt out of behavioral ads.

This is because multiple identifiers are being attached to them and their devices, and also because of frequent sharing/syncing of identifiers by adtech players across the industry. (It also points out that on the Android platform a setting where users can opt-out of behavioral ads does not actually obscure the identifier — meaning users have to take it on trust that adtech entities won’t just ignore their request and track them anyway.)
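For illustration only — this is our own sketch, not code from the report — here is roughly how an Android app or an embedded SDK reads the Advertising ID together with the user’s “limit ad tracking” flag via the Google Play Services ads-identifier library. It shows why, as the report argues, opting out does not by itself hide the identifier: the ID is still handed to whoever asks for it, and respecting the flag is left to the caller.

```kotlin
import android.content.Context
import com.google.android.gms.ads.identifier.AdvertisingIdClient

// Illustrative sketch (assumes the play-services-ads-identifier dependency).
// Must be called off the main thread, as the lookup performs IPC.
fun readAdvertisingId(context: Context): Pair<String?, Boolean> {
    val info = AdvertisingIdClient.getAdvertisingIdInfo(context)
    val adId = info.id                                   // the resettable ad identifier
    val limitAdTracking = info.isLimitAdTrackingEnabled  // the user's opt-out flag
    // Per the report's point: even when limitAdTracking is true, adId is
    // still returned here — callers are merely asked to respect the flag.
    return adId to limitAdTracking
}
```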

The Council argues its findings suggest widespread breaches of Europe’s General Data Protection Regulation (GDPR), given that key principles of that pan-EU framework — such as data protection by design and default — are in stark conflict with the systematic, pervasive background profiling of app users it found (apps were, for instance, found sharing personal data by default, requiring users to actively seek out an obscure device setting to try to prevent being profiled).

“The extent of tracking and complexity of the adtech industry is incomprehensible to consumers, meaning that individuals cannot make informed choices about how their personal data is collected, shared and used. Consequently, the massive commercial surveillance going on throughout the adtech industry is systematically at odds with our fundamental rights and freedoms,” it also argues.

Where (user) consent is being relied upon as a legal basis to process personal data, the GDPR requires that consent be informed, freely given and specific.

But the Council’s analysis of the apps found them sorely lacking on that front.

“In the cases described in this report, none of the apps or third parties appear to fulfil the legal conditions for collecting valid consent,” it writes. “Data subjects are not informed of how their personal data is shared and used in a clear and understandable way, and there are no granular choices regarding use of data that is not necessary for the functionality of the consumer-facing services.”

It also dismisses another possible legal basis — known as legitimate interests — arguing app users “cannot have a reasonable expectation for the amount of data sharing and the variety of purposes their personal data is used for in these cases”.

The report points out that other forms of digital advertising (such as contextual advertising) which do not rely on third parties processing personal data are available — arguing that this further undermines any adtech industry claims of ‘legitimate interests’ as a valid basis for helping themselves to smartphone users’ data.

“The large amount of personal data being sent to a variety of third parties, who all have their own purposes and policies for data processing, constitutes a widespread violation of data subjects’ privacy,” the Council argues. “Even if advertising is necessary to provide services free of charge, these violations of privacy are not strictly necessary in order to provide digital ads. Consequently, it seems unlikely that the legitimate interests that these companies may claim to have can be demonstrated to override the fundamental rights and freedoms of the data subject.”

The suggestion, therefore, is that “a large number of third parties that collect consumer data for purposes such as behavioural profiling, targeted advertising and real-time bidding, are in breach of the General Data Protection Regulation”.

The report also discusses the harms attached to such widespread violation of privacy — pointing out risks such as discrimination and manipulation of vulnerable individuals, as well as chilling effects on speech, added fuel for ad fraud and the torching of trust in the digital economy, among other society-afflicting ills being fuelled by adtech’s obsession with profiling everyone…

Some of the harm of this data exploitation stems from significant knowledge and power asymmetries that render consumers powerless. The overarching lack of transparency of the system makes consumers vulnerable to manipulation, particularly when unknown companies know almost everything about the individual consumer. However, even if regular consumers had comprehensive knowledge of the technologies and systems driving the adtech industry, there would still be very limited ways to stop or control the data exploitation.

Since the number and complexity of actors involved in digital marketing is staggering, consumers have no meaningful ways to resist or otherwise protect themselves from the effects of profiling. These effects include different forms of discrimination and exclusion, data being used for new and unknowable purposes, widespread fraud, and the chilling effects of massive commercial surveillance systems. In the long run, these issues are also contributing to the erosion of trust in the digital industry, which may have serious consequences for the digital economy.

To shift what it dubs the “significant power imbalance between consumers and third party companies”, the Council calls for an end to the current practices of “extensive tracking and profiling” — either by companies changing their practices to “respect consumers’ rights”, or — where they won’t — urging national regulators and enforcement authorities to “take active enforcement measures, to establish legal precedent to protect consumers against the illegal exploitation of personal data”.

It’s fair to say that enforcement of the GDPR remains a work in progress at this stage, some 20 months after the regulation came into force, back in May 2018, with scores of cross-border complaints yet to culminate in a decision (though there have been a couple of interesting adtech- and consent-related enforcements in France).

We reached out to Ireland’s Data Protection Commission (DPC) and the UK’s Information Commissioner’s Office (ICO) for comment on the Council’s report. The Irish regulator has multiple investigations ongoing into various aspects of adtech and tech giants’ handling of online privacy, including a probe related to security concerns attached to Google’s ad exchange and the real-time bidding process which features in some programmatic advertising. It has previously suggested the first decisions from its hefty backlog of GDPR complaints will be coming early this year. But at the time of writing the DPC had not responded to our request for comment on the report.

A spokeswoman for the ICO — which last year put out its own warnings to the behavioral advertising industry, urging it to change its practices — sent us this statement, attributed to Simon McDougall, its executive director for technology and innovation, in which he says the regulator has been prioritizing engaging with the adtech industry over its use of personal data and has called for change itself — but which does not once mention the word ‘enforcement’…

Over the past year we have prioritised engagement with the adtech industry on the use of personal data in programmatic advertising and real-time bidding.

Along the way we have seen increased debate and discussion, including reports like these, which factor into our approach where appropriate. We have also seen a general acknowledgment that things can’t continue as they have been.

Our 2019 update report into adtech highlights our concerns, and our revised guidance on the use of cookies gives greater clarity over what good looks like in this area.

Whilst industry has welcomed our report and recognises change is needed, there remains much more to be done to address the issues. Our engagement has substantiated many of the concerns we raised and, at the same time, we have also made some real progress.

Throughout the last year we have been clear that if change does not happen we would consider taking action. We will be saying more about our next steps soon – but as is the case with all of our powers, any future action will be proportionate and risk-based.
