GUIDE

Data Brokers in Europe: GDPR, UK Law, Germany, France — and the US Surveillance Risk Nobody Warned You About

Europe has the strongest privacy law in the world. The General Data Protection Regulation — GDPR — gives every EU resident the right to access their data, correct it, delete it, object to its use, and complain to a regulator with the power to impose fines of up to €20 million or 4% of a company's global annual turnover, whichever is higher. In practice, that ceiling is far higher for the companies that matter most: 4% of Meta's 2023 revenue is roughly $5.4 billion.

And yet, data brokers operate legally across Europe, marketing firms build profiles on hundreds of millions of EU residents, and US cloud companies process European personal data under a legal framework that privacy advocates are already challenging in court — for the third time. This article explains what the law actually guarantees, how Germany and France push beyond the minimum, why US surveillance law remains an unresolved structural risk, who the major data brokers holding EU data are, and how to use your GDPR rights against them — including where to submit opt-out requests via our EU data broker opt-out directory. See also our Data Broker Ecosystems hub for a broader view of the landscape.

Free Snapshot Scan

Want to see what brokers currently hold on you?

A Snapshot Scan checks your name and email across public sources, breach databases, and data broker listings. You receive a 1-page exposure summary within 48 hours — no commitment, no payment.

Request a Free Snapshot Scan

GDPR — What You Actually Have the Right to Do

The GDPR (Regulation 2016/679, effective 25 May 2018) applies to any organisation that processes personal data of individuals in the EU, regardless of where the organisation is based. A US data broker processing data on French residents must comply with GDPR.

Your Seven Core Rights

  • Right of Access (Art. 15): You can request a copy of all personal data an organisation holds about you, along with the purposes of processing, data sources, and how long it will be retained. Free of charge for the first copy. Response required within one month, extendable by two further months for complex or numerous requests.
  • Right to Rectification (Art. 16): You can demand correction of inaccurate data and completion of incomplete data without undue delay.
  • Right to Erasure — "Right to Be Forgotten" (Art. 17): You can require deletion when: the data is no longer needed for its original purpose; you withdraw consent and no other legal basis exists; you object under Art. 21 and the organisation has no overriding grounds; the data was processed unlawfully; or it was collected about you as a child. This right has limits — legal obligations, public interest, and legal claims can override it — but for most data broker use cases, it applies.
  • Right to Restriction (Art. 18): You can demand that processing is paused — data is stored but not used — while a dispute about accuracy or lawfulness is resolved.
  • Right to Data Portability (Art. 20): Where processing is based on consent or contract and carried out by automated means, you can receive your data in a structured, machine-readable format and transfer it elsewhere. Does not apply to processing based on legitimate interest.
  • Right to Object (Art. 21): For direct marketing (including profiling), this right is absolute — the organisation must stop, no questions asked, no balancing exercise required. For other processing based on legitimate interest, you can object on grounds specific to your situation, and the organisation must stop unless it can demonstrate compelling legitimate grounds that override yours.
  • Right Not to Be Subject to Automated Decision-Making (Art. 22): You cannot be subjected to decisions made solely by automated systems — including profiling — that produce legal or similarly significant effects, unless you have given explicit consent, it is necessary for a contract, or it is authorised by law with adequate safeguards.

The Data Broker's Legal Shield — Legitimate Interest

Data brokers almost universally rely on Article 6(1)(f) — legitimate interest — as their lawful basis for processing. This requires demonstrating: a legitimate purpose; that processing is necessary to achieve it; and that the purpose is not overridden by individuals' rights and freedoms. It is the most flexible of GDPR's six lawful bases and the most contested.

The problem: legitimate interest requires a balancing test that most brokers perform internally, without external scrutiny. The burden of demonstrating it is on the controller, not on you to disprove it. This is why the right to object (Art. 21) is so important — for direct marketing, you do not need to engage with the broker's justification at all. Your objection ends the processing, immediately.

The LinkedIn ruling illustrates this clearly. In November 2024, the Irish Data Protection Commission fined LinkedIn €310 million for relying on legitimate interest to process member data for targeted advertising. The DPC found LinkedIn's balancing test was inadequate — members' right to privacy outweighed LinkedIn's commercial interest in behavioural advertising. The case took years to resolve but established that a superficial legitimate interest assessment will not withstand regulatory scrutiny.

The Penalty Structure

GDPR penalties operate on two tiers under Article 83:

  • Lower tier (Art. 83(4)): Up to €10 million or 2% of global annual turnover, whichever is higher — for breaches of controller and processor obligations, certification requirements, etc.
  • Upper tier (Art. 83(5)): Up to €20 million or 4% of global annual turnover, whichever is higher — for breaches of core principles (Art. 5), lawful bases (Art. 6–9), data subject rights (Art. 12–22), and cross-border transfer rules (Art. 44–49).

For the largest tech companies, the turnover-based cap is the operative figure — and it is enormous. The €1.2 billion fine against Meta in May 2023 (for illegal trans-Atlantic data transfers) demonstrates that the theoretical maximum is not merely theoretical.
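The "whichever is higher" mechanics of Article 83 are worth making concrete. A minimal Python sketch (the ceilings are the statutory figures quoted above; the turnover inputs are purely illustrative):

```python
def gdpr_fine_cap(turnover_eur: float, tier: str = "upper") -> float:
    """Maximum fine under Art. 83 GDPR: a fixed ceiling or a percentage
    of global annual turnover, whichever is HIGHER."""
    if tier == "upper":            # Art. 83(5): EUR 20M or 4%
        fixed, pct = 20_000_000, 0.04
    elif tier == "lower":          # Art. 83(4): EUR 10M or 2%
        fixed, pct = 10_000_000, 0.02
    else:
        raise ValueError("tier must be 'upper' or 'lower'")
    return max(fixed, pct * turnover_eur)

# For a mid-sized firm, the fixed ceiling dominates (4% of EUR 200M is only EUR 8M):
print(gdpr_fine_cap(200_000_000))
# At hyperscaler turnover, the percentage term dominates (4% of EUR 125bn = EUR 5bn):
print(gdpr_fine_cap(125_000_000_000))
```

For the largest platforms the percentage term is always the operative ceiling, which is why a fine of €1.2 billion still sits comfortably inside the legal maximum.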

UK — After Brexit, Where Does This Leave You?

When the UK left the EU, it retained GDPR as "UK GDPR" via the European Union (Withdrawal) Act 2018, running alongside the Data Protection Act 2018. The substantive framework is nearly identical to EU GDPR, with the Information Commissioner's Office (ICO) replacing EU supervisory authorities.

UK maximum fines mirror EU GDPR: up to £17.5 million or 4% of global annual turnover, whichever is higher. Notable ICO enforcement:

  • British Airways — £20 million (2020): 2018 breach of payment and personal data for ~400,000 customers. Originally proposed at £183 million; reduced partly due to COVID-19 economic impact.
  • Marriott International — £18.4 million (2020): Starwood hotel booking system breach affecting ~339 million records globally, including approximately 7 million UK residents.
  • TikTok UK — £12.7 million (2023): Unlawful processing of data from approximately 1.4 million UK children under 13 without parental consent.
  • Clearview AI — £7.5 million (2022): Scraping facial images of UK residents from social media without lawful basis. Ordered to delete data.

The Adequacy Risk

The European Commission granted the UK an adequacy decision in June 2021, allowing data to flow from the EU to the UK without additional safeguards. This was valid for four years, subject to review — meaning the Commission's assessment was due by June 2025.

The Data Protection and Digital Information (DPDI) Bill — a Conservative government initiative that lapsed when the 2024 general election was called, with several provisions later revived in the Labour government's Data (Use and Access) Bill — proposed significant divergences from EU GDPR, including: a whitelist of pre-approved "legitimate interest" activities requiring no balancing test; a less prescriptive "Senior Responsible Individual" instead of a mandatory DPO; relaxed cookie consent (opt-out rather than opt-in for analytics); and more permissive rules on international data transfers. The EU Commission and privacy advocates warned that sufficient divergence would trigger a reassessment of adequacy — which would mean UK businesses would need alternative transfer mechanisms for EU data, and EU businesses could no longer freely transfer data to the UK. The adequacy question remains the central political and legal tension in UK data protection.

Germany — Europe's Most Demanding Privacy Jurisdiction

Germany supplements GDPR with the Bundesdatenschutzgesetz (BDSG 2018) and has a uniquely complex enforcement structure: one federal authority (BfDI) plus 16 separate state data protection authorities, each supervising businesses in their territory. The collective — the Datenschutzkonferenz (DSK) — issues joint guidance that often goes further than the minimum GDPR requires.

Key BDSG Additions

  • Lower DPO threshold: Germany requires a Data Protection Officer where, as a rule, at least 20 persons are permanently engaged in automated personal data processing — lower than GDPR's own threshold, which focuses on "large-scale" processing or "systematic monitoring."
  • Employee data protection (Section 26 BDSG): Among the strictest employee data rules in Europe. Data collected in the employment relationship must be strictly necessary, and consent from employees is viewed with scepticism — the power imbalance in the employment relationship means truly voluntary consent is difficult to establish.
  • Special category data: Germany exercises GDPR's opening clauses to maintain particularly strict conditions for sensitive data in the employment and social welfare contexts.

Notable German Enforcement Cases

  • H&M — €35.3 million (Hamburg DPA, 2020): H&M's service centre kept detailed records of employees' private lives gathered through "Welcome Back" conversations after absences — notes on health conditions, family problems, religious beliefs, relationship difficulties. Supervisors had access to years of records on hundreds of employees. This remains one of the largest employee privacy fines in GDPR history.
  • Deutsche Wohnen — €14.5 million (Berlin DPA, 2019): The property company retained tenant personal data — salary statements, bank records, employment contracts, health insurance data — indefinitely without reviewing whether retention was still necessary. A textbook storage limitation violation.
  • SCHUFA — CJEU ruling (December 2023, C-634/21): Germany's dominant credit bureau generates automated credit scores used by lenders, landlords, and telecoms companies. The Court of Justice of the EU ruled that SCHUFA's scores constitute "automated individual decision-making" under Art. 22 GDPR — because third parties treat the score as determinative without human review — triggering the right to human intervention, explanation, and challenge. Separately, the CJEU ruled SCHUFA cannot retain bankruptcy data for three years when the public insolvency register only retains it for six months, breaching the purpose limitation principle.

Germany and US Cloud Providers

The DSK and several state DPAs (particularly Bavaria, Berlin, and Baden-Württemberg) issued the most restrictive opinions in Europe on US cloud transfers before the EU-US Data Privacy Framework. Multiple state DPA opinions stated that processing sensitive personal data of EU residents on US cloud infrastructure was impermissible even with Standard Contractual Clauses — because US surveillance law (specifically FISA Section 702) prevents US cloud providers from guaranteeing GDPR compliance. Post-DPF, the formal prohibition softened but the DSK has maintained that the underlying structural risk remains unresolved.

France — Europe's Consent Enforcement Specialist

France's Commission Nationale de l'Informatique et des Libertés (CNIL) predates GDPR by 40 years — the original Loi Informatique et Libertés was enacted in 1978. CNIL has approximately 250 staff, handles thousands of complaints annually, and has built a reputation as one of the most aggressive enforcers of consent requirements, particularly for cookies and online advertising.

The Cookie Enforcement Campaign

CNIL's systematic enforcement against cookie consent failures has produced some of the most visible GDPR fines in Europe. The pattern is consistent: companies made it easy to accept all cookies (one click) and difficult to refuse them (multiple clicks, buried settings). CNIL ruled this violates the requirement for equally easy opt-out:

  • Google.fr and YouTube.fr — €150 million (January 2022): Refusing cookies required three clicks; accepting required one.
  • Facebook.com — €60 million (January 2022): Same asymmetry in the reject mechanism.
  • Bing (Microsoft) — €60 million (December 2022): Cookie rejection required multiple steps compared to one-click acceptance.
  • Amazon.fr — €35 million (December 2020): Cookies deposited before consent was obtained; information provided was insufficient.
  • Apple (App Store) — €8 million (December 2022): Advertising identifiers used for targeted ads without valid prior consent.
  • TikTok — €5 million (January 2023): Cookie consent mechanism failed the easy opt-out requirement.

Cookie Walls

CNIL has ruled that "cookie walls" — blocking access to a website unless a user accepts tracking cookies — are generally incompatible with freely given consent. Consent is not valid when the alternative is being denied access. CNIL permits "pay or consent" models only in limited circumstances where a genuinely free alternative (without advertising tracking) is available.

France's Approach to the Right to Be Forgotten

CNIL brought enforcement action against Google over its approach to delisting requests — removing search results from EU versions of its search engine (google.fr, google.de) but not from google.com. CNIL argued global delisting was required; Google argued GDPR is geographically limited. The CJEU ruled in 2019 (C-507/17) that GDPR does not mandate global delisting as a general rule — but EU DPAs can require broader delisting where fundamental rights are at serious risk.

The US Surveillance Risk Nobody Warned You About

This is the section that matters most for anyone evaluating their digital exposure in 2026. The short version: transfers of EU personal data to the United States are currently technically legal for certified companies under the EU-US Data Privacy Framework. But the framework rests on a legal and political foundation that is likely to be challenged and potentially struck down for the third time — and FISA Section 702, the US surveillance law at the heart of the dispute, was not only left unchanged but expanded.

The Legal Framework — Three Mechanisms

EU-US Data Privacy Framework (DPF, July 2023): The European Commission's adequacy decision for the US, adopted on 10 July 2023. US companies self-certify compliance with DPF principles through the Department of Commerce. Certified companies can receive EU personal data without additional transfer safeguards. The DPF replaced Privacy Shield, which was itself a replacement for Safe Harbor.

Standard Contractual Clauses (SCCs): Pre-approved contract terms (updated June 2021) that can be used for transfers to countries without an adequacy decision. Contractually bind the importing party. However, SCCs cannot override the law of the destination country — if US law requires an importer to hand over data to intelligence agencies, the SCCs do not prevent this. Transfer Impact Assessments (TIAs) are mandatory alongside SCCs: controllers must evaluate whether the destination country's law allows meaningful compliance with SCC obligations.

Binding Corporate Rules (BCRs): Internally approved codes of conduct for multinational corporate groups, authorised by a lead DPA. Only available for intra-group transfers. Expensive and slow to approve (12–24 months). Used primarily by large multinationals for employee data.

The Problem That Never Went Away: FISA Section 702

Section 702 of the Foreign Intelligence Surveillance Act authorises US intelligence agencies to compel US electronic communications service providers to assist in acquiring intelligence about non-US persons outside the United States — without individual judicial approval, without notifying the person targeted, and without EU legal process. Two programmes operate under FISA 702:

  • PRISM: The NSA acquires data directly from major US technology companies (Microsoft, Google, Apple, Meta, Yahoo, and others) about specific foreign targets.
  • UPSTREAM: The NSA accesses internet traffic in bulk directly from backbone internet infrastructure.

This is the structural problem. Microsoft Azure, Amazon AWS, and Google Cloud all process EU personal data on behalf of EU businesses. All three are US companies subject to FISA 702. They can be compelled to hand over EU data to US intelligence agencies without telling the affected individuals, without EU judicial oversight, and without any mechanism for those individuals to challenge it.

FISA 702 was reauthorised by the US Congress in April 2024 and expanded — broadening the definition of "electronic communications service providers" to potentially capture any business with access to equipment through which communications pass. Not narrowed. Expanded.

The Cloud Act

The CLOUD Act (2018) extends the problem to law enforcement. It allows US law enforcement agencies — not just intelligence agencies — to demand data from US cloud providers regardless of where the data is physically stored. A US federal court can issue a warrant requiring Microsoft to produce data stored on European servers. There is a challenge mechanism, but it is discretionary, not guaranteed, and EU data subjects have no automatic rights in the process.

Schrems I, II, and the Coming III

Max Schrems, the Austrian privacy activist and founder of noyb.eu (None of Your Business), has successfully challenged US-EU data transfer frameworks twice:

  • Schrems I (2015, C-362/14): Struck down Safe Harbor — the original US-EU adequacy decision — finding it did not provide "essentially equivalent" protection to EU law because US surveillance programmes operated without adequate safeguards.
  • Schrems II (2020, C-311/18): Struck down Privacy Shield for the same reason — FISA 702 allowed mass surveillance without individualised judicial authorisation, EU citizens had no effective judicial redress in the US, and the Privacy Shield Ombudsman was not independent. SCCs were upheld as a mechanism but required case-by-case TIA assessment, effectively making US transfers via SCCs extremely difficult to justify.
  • Schrems III (in progress): noyb and Schrems announced challenges to the DPF immediately after its July 2023 adoption, arguing that FISA 702 remains fundamentally unchanged, that Executive Order 14086's Data Protection Review Court is not an independent judicial body meeting EU Charter Art. 47 standards, and that the "necessary and proportionate" standard in the EO uses US definitions, not EU law definitions. A referral to the CJEU is expected; a ruling striking down the DPF would retroactively invalidate all data transfers made under it — creating catastrophic legal uncertainty for trans-Atlantic data flows.

The Real-World Consequences: Case Studies

Google Analytics: Before DPF, Austrian, French, Italian, Belgian, and Danish DPAs all ruled that use of Google Analytics on EU websites was illegal because it transferred data to the US without adequate safeguards. French users' IP addresses and browser data were transmitted to Google's US servers, where FISA 702 applied. The Austrian DPA's January 2022 ruling set off a wave of similar decisions across Europe. After DPF, the situation became more permissive for DPF-certified Google, but the underlying legal risk remains as long as Schrems III is unresolved.

Meta €1.2 billion fine (May 2023): The Irish Data Protection Commission fined Meta the largest single GDPR fine in history — €1,200,000,000 — for transferring Facebook users' personal data to the US (for processing on Meta's US infrastructure) without adequate safeguards. Meta had been relying on SCCs, which the DPC found inadequate given FISA 702's reach. Meta was ordered to suspend trans-Atlantic transfers within five months and delete improperly transferred data within six months. Meta continued under DPF after its July 2023 adoption.

Clearview AI — Fined and Ignored: Clearview AI, a US company, scraped billions of facial images from the internet — including from EU social media, news sites, and public sources — to build a facial recognition database licensed to law enforcement. It had no EU establishment and no legal basis for collecting biometric data on EU residents.

  • France (CNIL) — €20,000,000 (Oct 2022): ordered to delete French residents' data
  • Italy (Garante) — €20,000,000 (Mar 2022): ordered to delete Italian residents' data and stop processing
  • Greece (HDPA) — €20,000,000 (Jul 2022): ordered to delete Greek residents' data
  • UK (ICO) — £7,500,000 (May 2022): ordered to delete UK residents' data
  • Sweden (IMY) — SEK 2,500,000 (~€220,000) (Nov 2021): fined the Swedish Police Authority and prohibited it from using Clearview

Clearview has ignored every EU and UK fine. It has no EU assets to seize. This illustrates the fundamental enforcement gap for US companies with no EU establishment — GDPR's jurisdictional reach is broader than its practical enforcement power.

GDPR Enforcement — The Top Fines

  • Meta (Facebook) — €1,200,000,000 (Ireland DPC, May 2023): illegal US data transfers
  • Amazon Europe — €746,000,000 (Luxembourg CNPD, Jul 2021): cookie consent / ad processing
  • Meta (Instagram) — €405,000,000 (Ireland DPC, Sep 2022): children's data
  • Meta (Facebook) — €390,000,000 (Ireland DPC, Jan 2023): forced consent for behavioural advertising
  • TikTok Technology — €345,000,000 (Ireland DPC, Sep 2023): children's data, default privacy settings
  • LinkedIn Ireland — €310,000,000 (Ireland DPC, Nov 2024): legitimate interest misuse, targeted advertising
  • Meta (Facebook) — €265,000,000 (Ireland DPC, Nov 2022): scraped data breach exposure
  • WhatsApp Ireland — €225,000,000 (Ireland DPC, Sep 2021): transparency failures
  • Google LLC — €150,000,000 (France CNIL, Jan 2022): cookie consent asymmetry
  • Google LLC — €102,000,000 (France CNIL, Dec 2020): cookie consent
  • Microsoft (Bing) — €60,000,000 (France CNIL, Dec 2022): cookie consent
  • Facebook Ireland — €60,000,000 (France CNIL, Jan 2022): cookie consent
  • H&M — €35,300,000 (Hamburg DPA, Oct 2020): employee monitoring
  • Amazon — €35,000,000 (France CNIL, Dec 2020): cookies placed before consent
  • Clearview AI — €20,000,000 ×3 (France, Italy, Greece, 2022): unlawful facial recognition scraping
  • British Airways — £20,000,000 (UK ICO, Oct 2020): data breach (400,000 customers)
  • Marriott International — £18,400,000 (UK ICO, Oct 2020): data breach (339 million records)
  • Deutsche Wohnen — €14,500,000 (Berlin DPA, Nov 2019): excessive data retention
  • TikTok UK — £12,700,000 (UK ICO, Apr 2023): children's data
  • Apple — €8,000,000 (France CNIL, Dec 2022): App Store advertising consent

Skip the manual work

Exercising these rights across dozens of brokers takes time.

The Eraser handles opt-outs across 150+ data broker sites, drafts the objection requests, monitors for re-listing, and delivers a verified removal report. Or start with a free Snapshot Scan to see which brokers currently hold your data.

The Eraser — €3,800 Or start with a free Snapshot Scan

EU Data Broker Opt-Out Directory

We maintain a full directory of 75+ European data brokers — AdTech vendors, B2B data aggregators, and people-search intermediaries — with direct links to every privacy and erasure request page. If you want to act on the rights described above, start here.

EU Broker Opt-Out Directory · Full Opt-Out Guide (100+ Brokers)

How to Use Your GDPR Rights Against Data Brokers

Step 1 — Subject Access Request (Art. 15)

Send a written request (email is fine) to the data broker's Data Protection Officer or privacy team. State that you are making a Subject Access Request under Article 15 GDPR. Include your full name and any identifiers they might hold (email address, previous addresses, date of birth). Do not pay a fee unless they can justify it. They have one month to respond, extendable by two further months for complex requests if they notify you within the first month.
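If you are contacting dozens of brokers, the request itself can be drafted programmatically. A minimal sketch, assuming you supply your own identifiers (the template wording is illustrative boilerplate, not legal advice, and the function name is our own):

```python
def draft_sar(full_name: str, identifiers: list[str]) -> str:
    """Draft a plain-text Subject Access Request citing Art. 15 GDPR.
    The wording below is illustrative, not legal advice."""
    id_lines = "\n".join(f"- {i}" for i in identifiers)
    return (
        "Subject: Subject Access Request under Article 15 GDPR\n\n"
        "Dear Data Protection Officer,\n\n"
        f"I, {full_name}, request a copy of all personal data you hold about me, "
        "together with the purposes of processing, the sources of the data, the "
        "recipients or categories of recipients, and the envisaged retention "
        "period, as provided for by Article 15 GDPR.\n\n"
        "Identifiers that may match your records:\n"
        f"{id_lines}\n\n"
        "I expect a response within the statutory time limit.\n"
    )

# Generate one letter per broker from the same inputs (placeholder details):
letter = draft_sar("Jane Doe", ["Email: jane@example.com", "Date of birth: 1990-01-01"])
print(letter)
```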

Step 2 — Right to Object to Direct Marketing (Art. 21(2))

State in writing that you object to processing of your personal data for direct marketing purposes, including profiling related to direct marketing. This right is absolute — they must stop. There is no balancing exercise, no legitimate grounds override. Cite Art. 21(2) explicitly. They must stop "without undue delay."

Step 3 — Right to Erasure (Art. 17)

If you have objected and they have stopped direct marketing use but still hold your data, request erasure. The most common grounds for data brokers: the data is no longer necessary for the purposes for which it was collected (their original collection purpose likely did not include your current use case); or you have objected under Art. 21 and there are no overriding legitimate grounds. They have one month to respond.
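Tracking the response deadlines across many parallel requests is mechanical and easy to automate. A minimal sketch (broker names are placeholders; the 30-day window is a rough stand-in for the one-month statutory deadline):

```python
from datetime import date, timedelta

RESPONSE_WINDOW = timedelta(days=30)  # approximates GDPR's one-month deadline

def due_dates(sent: dict[str, date]) -> dict[str, date]:
    """Map each broker to the date by which a response is due."""
    return {broker: sent_on + RESPONSE_WINDOW for broker, sent_on in sent.items()}

def overdue(sent: dict[str, date], today: date) -> list[str]:
    """Brokers whose deadline has passed: candidates for a DPA complaint."""
    return [b for b, due in due_dates(sent).items() if today > due]

requests = {"ExampleBroker A": date(2025, 3, 1), "ExampleBroker B": date(2025, 4, 15)}
print(due_dates(requests))
print(overdue(requests, today=date(2025, 4, 10)))  # only A's deadline has passed
```

Anything the `overdue` list flags feeds directly into Step 4 below: a complaint to your national DPA.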

Step 4 — Complain to Your National DPA If They Don't Comply

If a controller ignores your request, delays beyond the one-month deadline without valid justification, or refuses without adequate grounds, file a complaint with your national data protection authority. Under Art. 77, you have the right to complain in the Member State of your habitual residence, your place of work, or where the alleged infringement occurred. This costs nothing.

  • Germany — BfDI (plus the relevant Landesbehörde) — bfdi.bund.de
  • France — CNIL — cnil.fr/fr/plaintes
  • Netherlands — Autoriteit Persoonsgegevens — autoriteitpersoonsgegevens.nl
  • Ireland — Data Protection Commission — dataprotection.ie
  • Italy — Garante — garanteprivacy.it
  • Spain — AEPD — aepd.es
  • Belgium — APD/GBA — dataprotectionauthority.be
  • Austria — Datenschutzbehörde — dsb.gv.at
  • Sweden — IMY — imy.se
  • UK — ICO — ico.org.uk/make-a-complaint

Collective Action — noyb.eu

Under Art. 80 GDPR, non-profit organisations can bring complaints on behalf of data subjects or independently. noyb.eu (None of Your Business), founded by Max Schrems, files strategic complaints against major platforms and data brokers across multiple EU jurisdictions simultaneously. Reporting a violation to noyb may result in a coordinated complaint that carries more enforcement weight than an individual filing.

The Bottom Line

GDPR gives European residents the most comprehensive set of privacy rights anywhere in the world. The right to object to direct marketing use of your data is absolute. The right to access and request deletion is real and enforceable. Fines are large enough to matter even for the biggest companies in the world.

But the law has limits. Data brokers routinely stretch legitimate interest to its breaking point. US surveillance law creates a structural conflict that three EU-US transfer frameworks have not resolved and that a likely Schrems III ruling may invalidate again. Clearview demonstrates that GDPR's reach is only as strong as its enforcement against companies with no EU presence and no EU assets. And even where rights formally exist, exercising them requires awareness, persistence, and in many cases a willingness to escalate to national regulators.

The most effective protection is not a rights request sent to a company that will re-acquire your data within months. It is reducing how much data exists about you in the first place. Our free opt-out guide and EU broker directory provide a starting point. If you are weighing paid tools, our country-by-country comparison of data broker removal services in Europe covers which platforms actually work in France, Germany, the Netherlands, Spain and the UK. For significant or ongoing exposure, a professional removal service handles the GDPR requests, monitors for re-listing, and documents each deletion.

Resources

Related Service

The Eraser€3,800

Manual removal from 500+ data brokers, Google search suppression, social media archive cleanup, and a 90-day re-scrub guarantee.

Start Erasure — €3,800 Or Get a Free Exposure Check

Share this briefing

If this was useful, sharing it helps others protect themselves. It also helps keep the intelligence briefings free.