This note examines a Belgian court ruling against Facebook’s tracking and approach to consent. Facebook and adtech companies should expect tough sanctions when they find themselves before European courts – unless they change their current approach to data protection and the GDPR.
Facebook is playing a dangerous game of “chicken” with the regulators. First, it has begun to confront users in the EU with a new “terms of service” dialogue, which denies access to Facebook until a user opts in to tracking for ad targeting and various other data processing purposes. (See footnote 1 for more detail.)
This dialogue appears to breach several important principles of the GDPR, including purpose limitation, freely given and non-conditional consent, and transparency. In other words, if Facebook attempts to collect consent in this manner, that consent will be unlawful. European regulators have been very clear on this point.
Second, on 1 May 2018, a mere twenty-four days before the application date of the GDPR, Facebook’s head of privacy announced plans to build “Clear History”, a feature with which users can opt out of Facebook collecting data about their visits to other websites and apps. But the GDPR demands not an opt-out, but an opt-in. Nor is Clear History available to non-Facebook users. And as a further sign of Facebook’s brinksmanship, it said “it will take a few months to build Clear History”, which means that the feature will not be available to users until long after the GDPR has been applied later this month.
Facebook’s approach puts it on a collision course with European courts. This note examines one recent decision in which the Brussels Court of First Instance ruled that Facebook’s tracking of people on other websites is illegal, and that its approach to consent is invalid. The immediate result was a financial penalty, and an order that Facebook must submit to having an independent expert supervise its deletion of all the personal data it illegally amassed.
The implications of the ruling are far wider. It is an insight into the hazard for digital publishers and adtech vendors of failing to heed the warnings of the Article 29 Working Party.
Important lessons for RTB/programmatic
Belgium’s data protection authority, the Belgian Privacy Commission, charged that Facebook’s “Like” buttons and trackers on websites all over the web enable it to look “over the shoulders of persons while they are browsing from one website to the next … without sufficiently informing the relevant parties and obtaining their valid consent”.
The Court agreed, and summarized Facebook’s tracking in its ruling:
When someone visits a website with such a Facebook social plug-in, his browser will automatically establish a connection with (by sending an http request to) the Facebook server, after which the visitor’s browser directly loads the “plug-in” function from the Facebook server.
The Court’s ruling outlined what data are received by Facebook from its social plugins installed on other websites:
1. IP address;
2. URL of the page of the website requested by the user;
3. The operating system;
4. The type of browser, and
5. The cookies (previously) placed by the third-party website from which the browser requests the third-party content.
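The flow the Court describes can be sketched in code. The sketch below is purely illustrative — the page URL, user-agent string, and cookie value are hypothetical stand-ins (only the “datr” cookie name appears in the record) — and shows how a browser’s automatic request to a plug-in server carries each of the five data points:

```python
# Illustrative sketch: the request a browser sends automatically when a page
# embeds a third-party social plug-in. Each field maps onto one item in the
# Court's list above. All concrete values here are hypothetical.

plugin_request = {
    "method": "GET",
    "headers": {
        # 2. URL of the page the user is visiting, sent as the Referer header
        "Referer": "https://example-news-site.com/articles/health-story",
        # 3./4. Operating system and browser type, sent as the User-Agent
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0) Gecko/20100101 Firefox/60.0",
        # 5. Cookies previously placed for the plug-in's domain (e.g. "datr")
        "Cookie": "datr=abc123",
    },
    # 1. The IP address is visible to the server from the connection itself,
    #    so it appears in no header.
}

def data_points_received(request):
    """List which of the Court's five data points this request exposes."""
    headers = request["headers"]
    points = ["IP address (from the connection)"]
    if "Referer" in headers:
        points.append("URL of the visited page")
    if "User-Agent" in headers:
        points.append("operating system and browser type")
    if "Cookie" in headers:
        points.append("previously placed cookies")
    return points
```

The point of the sketch is that none of this requires any action by the user: the browser attaches these headers to every request for embedded third-party content.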
In a previous judgement in 2015, the Court observed that these browsing data are “frequently of a very sensitive nature, allowing, for example, health-related, sexual and political preferences to be gauged”.
This should give pause to digital publishers and adtech vendors, because these data, which reveal special categories of personal data, are exactly the same data that websites routinely broadcast to tens – if not hundreds – of companies in RTB bid requests. This happens every time an advertisement is served.
The Court noted that the scale of Facebook’s presence across the web makes this tracking “practically unavoidable”. The February 2018 ruling reiterated the Court’s previous ruling in 2015 that “the extent of the violations in question is massive: they do not only concern the violation of the fundamental rights of a single person, but of an enormous group of persons.”
This too should give the online media and advertising industry pause, because the same applies to the broadcasting of personal data in RTB bid requests by the majority of major websites across the globe, and to the creation of profiles based on these personal data by DMPs and other adtech vendors.
Facebook’s notification fig leaf ruled unlawful
Facebook did provide a notice to users about this tracking. Unsurprisingly, the Court ruled that it was utterly inadequate:
The court has come to the decision that in all the cases described, Facebook does not obtain any legally valid consent in the sense of Article 5 (a) Privacy Act and Article 129 ECA for the disputed data processing.
As a result, the Court ruled that Facebook does not have a legal basis for tracking Internet users as they browse the web. Nor does Facebook have a legal basis for tracking logged-in users around the web.
Several of the Court’s admonitions are worth including here, because they are directly relevant to Facebook and other online media and adtech companies’ approaches to the GDPR.
First, the Court found that non-Facebook users are never told that their behavior on websites across the web is being profiled by Facebook:
When non-users visit a website of a third party that includes an (invisible) Facebook pixel that allows for tracking of browsing behavior, without indicating that they wish to make use of the Facebook service, no information mechanism (such as a banner) is displayed.
This remains a legal risk for Facebook, and “Clear History” does not adequately mitigate this risk.
Second, the Court ruled that Facebook’s request for consent was not specific, and that any consent that it received was unlawful as a result:
‘Specific’ means that the expression of will must relate to a specific instance or category of data processing and can thus not be obtained on the basis of a general authorization for an open series of processing activities.
This part of the ruling was based on Article 1, section 8, of the Belgian Privacy Act, which uses the same formula of words as Article 4, paragraph 11, of the GDPR (“freely given, specific, informed…”). In other words, the Court is upholding a standard that is virtually identical to the standard that will apply under the GDPR. Facebook’s new GDPR consent dialogue faces the same problem, and is unlawful for the same reason.
Third, the Court found that Facebook users are not clearly told what “purposes” Facebook processes the personal data for. Nor does it clearly explain its use of sensitive data including any personal data that could reveal religious belief, sexual orientation, etc.:
Facebook has recently gone some way to inform users about the use of personal data concerning their political interests, but this is only a partial solution to a far broader risk for the company. Its handling of sensitive categories of personal data will be a major challenge, which it has yet to show any ability to resolve.
Fourth, and unsurprisingly in the aftermath of the Cambridge Analytica scandal, the Court found that Facebook did not properly disclose who it was sharing the data with. Nor did it provide any information about “the existence of a right to access and correction of the personal data concerning him”. This is likely to remain a significant challenge.
Fifth, the Court found that Facebook was not even complying with its own self-regulatory system. Whatever one’s view of the “AdChoices” self-regulatory system, it is quite remarkable that Facebook continued to track people even after they had used it to opt out.
Facebook forced to delete data (and fined)
The Brussels Court ordered Facebook to pay €250,000 per day, up to a maximum of €100 million, until it stopped its unlawful behavior.
This was a strong statement. To put this fine into perspective, consider that Belgium has a population of 11.35 million people, which is only 2% of the population of the EU. At the same value per person, the EU equivalent would be €12.5 million per day, up to a maximum of €5 billion.
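The proportionality arithmetic above is simple to verify (a sketch using the figures quoted in the text, with Belgium treated as exactly 2% of the EU population):

```python
# Scaling the Belgian penalty to the EU by population share, as in the text.
# Belgium (11.35 million people, Eurostat 2017) is roughly 2% of the EU
# population, so the per-person-equivalent EU figure is the Belgian figure x50.

belgium_share_pct = 2                 # Belgium ~ 2% of the EU population
scale = 100 // belgium_share_pct      # 50

daily_penalty_be = 250_000            # €250,000 per day
max_penalty_be = 100_000_000          # capped at €100 million

daily_penalty_eu = daily_penalty_be * scale   # €12.5 million per day
max_penalty_eu = max_penalty_be * scale       # €5 billion cap
```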
In addition, Facebook was ordered to submit to an independent expert supervising its deletion of all illegal data that it had amassed about every user on Belgian soil. It also had to make sure that third parties to whom it provided illegal data do the same.
The Cambridge Analytica scandal shows that this last point about ensuring that third parties delete their copies of Facebook’s illegally accumulated data will be impossible for Facebook to comply with, because of its lax data sharing standards. Recall that Mark Zuckerberg told US lawmakers:
When developers told us they weren’t going to sell data, we thought that was a good representation. But one of the big lessons we’ve learned is that clearly, we cannot just take developer’s word for it.
In other words, Facebook was sharing personal data without any control whatsoever, much as websites do when they send visitors’ personal data in RTB bid requests. Even if the original collection of the data had been lawful, this uncontrolled distribution certainly would not be. Again, the parallel with RTB bid requests should give publishers and adtech vendors pause.
What the Article 29 Working Party says, goes
Many of our colleagues in adtech have been unwilling to heed the counsel of the Article 29 Working Party (a roundtable of European regulators). The Brussels Court’s ruling illustrates the Working Party’s importance and authority. Although the Court is the arbiter, it relied on the Working Party’s authoritative opinions throughout its ruling. (The ruling cited the Working Party’s 2011 opinion on consent (15/2011), its 2010 opinion on online behavioral advertising (2/2010), its 2013 opinion on purpose limitation (2/2013), and its 2010 opinion on the concepts of data controller and data processor (1/2010).)
Whether or not businesses take the Working Party seriously, judges do, which is what matters when businesses find themselves facing sanctions for data misuse. This should demonstrate the value of closely abiding by the opinions of the Working Party. The public guidance of the Article 29 Working Party has illuminated the requirements of European data protection law for over two decades, and provides an invaluable guide to businesses scrambling to comply with a body of law they have hitherto largely neglected.
Facebook cannot reject users who refuse non-essential tracking
The Court ruled that Facebook cannot reject users who refuse to agree to tracking – unless the tracking in question is necessary for the service that a user explicitly requests from Facebook. Instead, the Court ruled that users should be
given the option of refusing the placement of these cookies, in as far as this is not strictly necessary for a service explicitly requested by him, without his access to the Facebook.com domain being hereby limited or rendered more difficult.
In December 2015, Facebook had blocked access to all Belgian users, following a court injunction that forbade it to place a (“Datr”) cookie without properly informing users. (See footnote 41 for elaboration.) Facebook attempted to justify this denial of service in a notice to users that claimed it could not provide service because it was prohibited from taking measures (the unlawful tracking) to prevent unauthorized access to users’ Facebook accounts. The Court took a dim view of this:
The court concurs … that the systematic collection of the personal data of users and non-users via social plug-ins on the websites of third parties is not essential (let alone “strictly essential” in the sense of Article 129 ECA), or at least not proportional to the achievement of the safeguarding objective.
The Court believed that Facebook’s purported fraud detection was insufficient in any case:
the systematic collection of safeguarding cookies is inadequate as a means of safeguarding, as it is easy to circumvent by persons with malicious intentions.
Conclusion: fewer data, not more, will help Facebook in the EU
This ruling is one of several defeats Facebook has suffered in European courts in recent months. In January, the Berlin Regional Court ruled that Facebook’s approach to consent and its terms are unlawful. In April, the Irish High Court referred important aspects of Facebook’s trans-Atlantic transfers of personal data to the European Court of Justice, once again, for scrutiny. It is likely that worse is to come, unless Facebook significantly changes its approach to data protection within the EU.
However, the company has options. As unlikely as it may seem now, one can foresee that Facebook will introduce ad targeting based on non-personal data in the newsfeed. This is likely to be necessary because Facebook will be unable to win lawful consent for some of its data processing purposes for sensitive personal data (or for processing purposes for regular personal data that are not “compatible” with purposes to which the user has already agreed).
It seems likely that this problem encompasses all personalized advertising in the newsfeed, custom audiences, and social share buttons on other websites. Therefore, Facebook must have a way of targeting ads at non-consenting users. Non-personal data would allow this.
It may also become important for Facebook to be able to participate in a clean and safe data supply chain, about which major advertisers are beginning to show concern.
In addition, Facebook will have to limit the use of custom audiences to situations where it is certain that the advertiser has a valid legal basis.
There is a broader lesson. Digital publishers and adtech vendors need to urgently reassess the use of personal data in programmatic advertising, and reflect on how adtech’s shaky consent systems will fare in Europe’s courts.
 The new terms mention personalization of ads. See “Terms of service”, Facebook (URL: https://www.facebook.com/legal/terms/update), accessed 2 May 2018.
The Terms also refer to the data policy, which elaborates that “we use the information we have about you – including information about your interests, actions and connections – to select and personalise ads, offers and other sponsored content that we show you.” The data policy also says “We use the information [including] the websites you visit and ads you see … to help advertisers and other partners measure the effectiveness and distribution of their ads and services, and understand the types of people who use their services and how people interact with their websites, apps and services”. “Data policy”, Facebook (URL: https://www.facebook.com/about/privacy/update), accessed 2 May 2018.
 The GDPR, Article 5, paragraph 1.
 The GDPR, Article 7, paragraph 2.
 The GDPR, Article 13, paragraph 1 and paragraph 2.
 “Guidelines on consent under Regulation 2016/679”, WP259, Article 29 Working Party, 28 November 2017, p. 11.
 “Getting feedback on new tools to protect people’s privacy”, Facebook, 1 May 2018 (URL: https://newsroom.fb.com/news/2018/05/clear-history-2/).
 See the GDPR, Article 6, Article 8, and Article 9.
 “Getting feedback on new tools to protect people’s privacy”, Facebook.
 Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., Dutch-language Brussels Court of First Instance (Nederlandstalige Rechtbank van Eerste Aanleg te Brussel/Tribunal de Première Instance néerlandophone de Bruxelles – the “Court”), 16 February 2018, 2016/153/A (URL: https://johnnyryan.files.wordpress.com/2018/04/belgian-court-judgement.pdf).
 It has since changed its name to the Belgian Data Protection Authority.
 Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., p. 12.
 ibid., p. 9. See more detail on pp. 49-51.
 ibid., p. 9.
 “Data leakage in online advertising”, PageFair (URL: https://pagefair.com/data-leakage-in-online-behavioural-advertising/).
 Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., p. 69.
 ibid., p. 69.
Note that this raises the competition (antitrust) question, as Germany’s competition regulator, Andreas Mundt, has pointed out: “If Facebook has a dominant market position, then the consent that the user gives for his data to be used is no longer voluntary” (see https://www.reuters.com/article/us-facebook-privacy-germany/facebooks-hidden-data-haul-troubles-german-cartel-regulator-idUSKBN1HU108).
 ibid., p. 8.
 Which implemented the Data Protection Directive.
 Electronic Communications Act of 20 June 2005, which implemented the ePrivacy Directive.
 Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., p. 64.
 ibid., pp. 73-4.
 ibid., p. 57.
 ibid., p. 61.
 ibid., p. 58.
 See discussion of special categories of data in the newsfeed in “How the GDPR will disrupt Google and Facebook”, PageFair, 30 August 2017 (URL: https://pagefair.com/blog/2017/gdpr_risk_to_the_duopoly/).
 ibid., p. 59.
 See testimony by Chris Vickery at the UK Parliament Digital, Culture, Media and Sport Committee Wednesday 2 May 2018 (URL: https://www.parliamentlive.tv/Event/Index/0cf92dd0-f484-4699-9e01-81c86acb880c)
 ibid., p. 63.
 ibid., p. 70.
 World Bank, 2016.
 Eurostat, population on 1 January 2017 (URL: ec.europa.eu/eurostat/tgm/table.do?tab=table&plugin=1&language=en&pcode=tps00001)
 Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., pp. 14, 70.
 Testimony of Mark Zuckerberg Chairman and Chief Executive Officer, Facebook, Hearing before the United States House of Representatives Committee on Energy and Commerce, 11 April 2018 (URL: https://www.c-span.org/video/?443490-1/facebook-ceo-mark-zuckerberg-testifies-data-protection&live&start=4929#).
 Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., pp. 57-8, 61.
 ibid., p. 59.
 ibid., p. 60.
 ibid., p. 70.
 ibid., pp. 13, 72.
 ibid., p. 72. See the Privacy Commission’s argument for this on p. 13.
 After an order from the Privacy Commission, which was backed up by a Court injunction. In 2015, the Privacy Commission ordered Facebook to, among other things, stop tracking non-users using cookies and social plug-ins without consent, and to do the same for users “unless strictly necessary for a service explicitly requested by the user” or unless it obtained “unequivocal, specific consent”. It was also ordered to use consent requests that are unequivocal and specific. When Facebook failed to comply, this was followed by a court order in November 2015. Facebook responded by blocking access to users. See ibid., pp. 4-7.
 ibid., pp. 65-6.
 ibid., p. 67.
 Judgment of the Berlin Regional Court dated 16 January 2018, Case no. 16 O 341/15 (URL: https://johnnyryan.files.wordpress.com/2018/04/berlin-court-judgement-german.pdf)
 The High Court, Commercial, 2016, N. 4809 P., The Data Protection Commissioner v Facebook Ireland and Maximilian Schrems, Request for a preliminary ruling, Article 267 TFEU, 12 April 2018.
See also Judgement of Ms Justice Costello, The High Court, Commercial, 2016, No. 4809 P., The Data Protection Commissioner v Facebook Ireland and Maximillian Schrems, 3 October 2017.
Note, this is the second “Schrems” case. The first caused the end of the EU-US Safe Harbor agreement.
 See a discussion on Facebook and purpose limitation in “How the GDPR will disrupt Google and Facebook”, PageFair, 30 August 2017 (URL: https://pagefair.com/blog/2017/gdpr_risk_to_the_duopoly/).
 “WFA Manifesto for Online Data Transparency”, World Federation of Advertisers, 20 April 2018 (URL: https://www.wfanet.org/news-centre/wfa-manifesto-for-online-data-transparency/). See also Stephan Loerke, WFA CEO, “GDPR data-privacy rules signal a welcome revolution”, AdAge, 25 January 2018 (URL: adage.com/article/cmo-strategy/gdpr-signals-a-revolution/312074/).