ANALYSIS

Right of Access as Reconnaissance: The Article 15 Verification Gap

The right of access is the most-cited GDPR provision and the least operationally audited. Every data subject in the EU and UK can ask any controller, free of charge, for a full copy of the personal data that controller holds about them — and the controller has one calendar month to comply. Non-response risks regulatory fines. Over-verification risks the same. Under-verification risks something regulators have not yet prosecuted at scale: releasing personal data to the wrong person on the strength of a credible-looking request.

What these controllers hold turns the verification gap into a targeting problem. Credit bureaus, identity-risk providers, and data enrichment services maintain exactly the datasets a competent attacker would build as a phishing, impersonation, or CEO-fraud toolkit: full names, home and previous addresses, dates of birth, phone numbers, email addresses, financial history, account relationships, transaction patterns, bank identifiers, employment records — and at the industrial end, device profiles and behavioural signals. An Art 15 response from a single controller in this sector is not a transparency document. It is a fully assembled reconnaissance file, delivered to the requester's chosen address in a machine-readable format, on a regulated timeline.

From an attacker's perspective, the economics are stark. Breaching a database is work — reconnaissance, infrastructure, operational security, real prosecution risk. Filing twenty forged subject access requests is email, a plausible name, and thirty days of patience.

That gap is the subject of this article. Article 15 was written to shift power toward data subjects. Implemented inconsistently, it also creates a pre-authenticated data exfiltration channel — one that understaffed controllers are structurally biased to use rather than close, because the cost of non-response is visible and the cost of over-disclosure is not.

The gap will close. NIS2 Art 20 makes management-body members personally accountable for organisational cybersecurity failures, and the first cases will hit controllers whose Art 15 pipelines leaked data to a credible-looking attacker. That will tighten the sector faster than any DPA enforcement has to date. The question is what happens in the 18 to 24 months before that starts.

What Article 15 actually does

GDPR Art 15 grants every data subject the right to obtain, from any controller: confirmation of whether their personal data is being processed, a copy of the personal data being processed, and information about the purposes, categories, recipients, retention periods, source of the data, and whether automated decision-making is in use. UK GDPR Art 15 is substantively identical.

The controller has one calendar month to respond under Art 12(3), extendable by two months "where necessary, taking into account the complexity and number of the requests." Responses must be free of charge unless requests are "manifestly unfounded or excessive." The controller must verify the requester's identity — but only "proportionately."
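The deadline arithmetic can be made concrete. A minimal sketch, assuming the clock starts on the day of receipt and that "one month" means the same calendar day in the following month, clamped to month-end (national rules on when the clock starts vary, so treat this as illustrative only):

```python
import calendar
from datetime import date

def add_months(d: date, months: int) -> date:
    # Move forward by whole calendar months, clamping to the last day
    # of the target month (e.g. 31 Jan + 1 month -> 28/29 Feb).
    month_index = d.month - 1 + months
    year = d.year + month_index // 12
    month = month_index % 12 + 1
    day = min(d.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

def sar_deadline(received: date, extended: bool = False) -> date:
    # Art 12(3): respond within one month of receipt, extendable by a
    # further two months for complex or numerous requests.
    return add_months(received, 3 if extended else 1)

print(sar_deadline(date(2026, 1, 31)))                 # 2026-02-28
print(sar_deadline(date(2026, 1, 31), extended=True))  # 2026-04-30
```

The month-end clamp matters operationally: a request received on 31 January is not due on a non-existent 31 February, and a tracking system that naively adds 30 days will drift off the regulatory deadline several months a year.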

The EDPB's Guidelines 01/2022 on data subject rights, adopted in January 2022 and finalised in April 2023, set out the verification standard. The key language: requesting a copy of an identity document is "generally inappropriate unless strictly necessary, suitable, and in line with national laws." Verification must be proportionate to the sensitivity of the data and the potential damage a wrongful disclosure could cause.

The Dutch Autoriteit Persoonsgegevens has published a three-tier position on SAR identity verification:

"You are never allowed to ask for a full copy of the identity document for this purpose. A full copy is a copy in which all personal data are visible. You are only allowed to ask for a full copy ID if you are obliged by law to do so."

"In all other cases, you are not allowed to ask for more than a copy ID in which certain data have been blocked. But this is only permitted if there really is no other way. In most cases, there are less intrusive ways to establish someone's identity. You must always try as much as possible to establish someone's identity using the data you already have from this person."

AP, Privacy rights in practice

The AP lists the less-intrusive methods a controller should try first: an existing login system, multi-factor authentication using customer data the controller already holds (for example, email confirmation of a telephone request, or asking for the last three digits of an account number plus date of birth), or an in-person ID check without making a copy. A redacted ID document is the fallback — only after less-intrusive methods have been attempted, and only if none is available. DPG Media, following an AP decision, moved to email-based verification, which the AP confirmed is compliant.

The operational consequence: the most defensible verification is the least intrusive one that provides reasonable assurance the requester is the data subject. Email-based verification, supplemented by out-of-band channel confirmation, is the current European baseline. Document-based verification is the fallback, not the default.
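A hedged sketch of what "email plus out-of-band confirmation" can look like in code. The names, parameters, and flow here are illustrative assumptions, not any controller's actual implementation. The key property is that the confirmation token is bound to the address already on file, so a requester who merely asserts an address cannot redeem it:

```python
import hashlib
import hmac
import secrets
import time

SECRET_KEY = secrets.token_bytes(32)  # per-deployment signing key (illustrative)

def issue_confirmation(request_id: str, email_on_file: str, ttl_s: int = 3600) -> str:
    """Create a token to be mailed to the address ALREADY held for the
    data subject -- never to an address supplied in the request itself."""
    expiry = int(time.time()) + ttl_s
    msg = f"{request_id}|{email_on_file}|{expiry}".encode()
    sig = hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest()
    return f"{expiry}.{sig}"

def redeem_confirmation(token: str, request_id: str, email_on_file: str) -> bool:
    """Token is valid only for the same request and the same on-file address."""
    try:
        expiry_s, sig = token.split(".")
    except ValueError:
        return False
    if int(expiry_s) < time.time():
        return False  # expired
    msg = f"{request_id}|{email_on_file}|{expiry_s}".encode()
    expected = hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)
```

Because the token is only ever delivered to the on-file address, a forged reply-to address never sees it — which is precisely the gap the forged-SAR pattern exploits.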

But nothing in GDPR text specifies this baseline. "Proportionate" is left to each controller to interpret, with DPA case-law adjusting it post-hoc. The result is sector-wide inconsistency — which is what creates the attack channel.

The attacker view

In 2019, security researcher James Pavur presented research at Black Hat USA titled GDPArrrrr: Using Privacy Laws to Steal Identities. With his fiancée's explicit consent, Pavur submitted subject access requests to 150 companies using only her name and a forged email address. Seventy-two percent of the companies responded; 83 confirmed they held personal data about her. The full whitepaper is on arXiv.

The published results were stark. Twenty-four percent of the companies returned sensitive personal data on the strength of the forged email alone — including Social Security numbers, account passwords, home addresses, payment card digits, and travel history. Three percent deleted the account outright without any verification. Sixteen percent asked for weaker forms of identification that a prepared attacker could readily forge. Thirty-nine percent requested further information Pavur was unwilling to forge. Thirteen percent ignored the request.

More than 20 of the 150 companies released genuinely sensitive data to a requester who had provided nothing beyond a name and a credible-sounding email. The research was conducted within the first year of GDPR enforcement. It was published openly at the world's largest security conference. And the baseline has not materially changed.

Current threat-actor behaviour builds on Pavur's findings. Identity packs — structured bundles of personal data used for impersonation, social engineering, and executive targeting — are assembled from multiple sources: breach archives, stealer logs, data broker listings, and increasingly, forged SAR responses. A forged SAR to one controller often yields the context needed to forge a more credible SAR to a second, because the first response contains transaction dates, address history, or family member references the second controller uses as verification questions.

This is the cross-reference attack. Controller A's "confirm your last three transactions" question is answered by Controller B's SAR response. Controller B's "confirm your prior addresses" question is answered by Controller A's. A series of well-targeted SARs across five to ten controllers can build an identity pack that passes further verification with very little forgery risk.
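The mechanics of the cross-reference attack reduce to a dependency walk, sketched here as a toy simulation. All controller names, challenge fields, and response fields are hypothetical: each controller's verification challenge is a set of fields, each SAR response yields a set of fields, and the attacker processes whichever controller the current identity pack unlocks next:

```python
# Toy model of the cross-reference attack. All names are hypothetical.
challenges = {          # fields a controller asks for at verification
    "ctrl_A": {"name", "email"},
    "ctrl_B": {"name", "last_3_transactions"},
    "ctrl_C": {"dob", "address_history"},
}
responses = {           # fields the controller's SAR response yields
    "ctrl_A": {"last_3_transactions", "address_history"},
    "ctrl_B": {"dob", "account_last4"},
    "ctrl_C": {"bank_identifiers", "employment_history"},
}

pack = {"name", "email"}   # attacker's starting material
order = []
progress = True
while progress:
    progress = False
    for ctrl, needed in challenges.items():
        if ctrl not in order and needed <= pack:  # challenge satisfiable
            pack |= responses[ctrl]               # response enriches the pack
            order.append(ctrl)
            progress = True

print(order)        # ['ctrl_A', 'ctrl_B', 'ctrl_C']
print(sorted(pack))
```

Starting from nothing but a name and an email, the walk passes every verification gate in sequence: controller A's response answers controller B's challenge, and B's response answers C's. The defensive corollary is that verification questions drawn from data another controller would disclose in a SAR response are not secrets.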

Four conditions make this work:

  1. Thirty-day clock pressure. Controllers treat non-response as the risk because it is the visible risk. Over-disclosure is invisible unless the target notices and complains.
  2. Email verification as baseline. A forged email requires a compromised or look-alike domain — technically trivial for a prepared attacker.
  3. Analyst-discretion verification. "Reasonable proof" standards mean the tenth SAR of the day at 4 PM on a Friday is treated differently from the first SAR of the day at 9 AM Monday.
  4. Best-guess disclosure bias. When verification is weak but the request is credible, controllers default to disclose — because the documented decision ("we weren't certain but the request looked authentic") reads better in a DPA investigation than "we blocked a legitimate data subject's request."

None of these conditions are new. What is new is the increasing sophistication of identity-pack construction, the availability of stealer-log-derived verification material, and the incoming NIS2 personal-liability framework that will make wrongful disclosure a board-level problem.

Six controller archetypes from the field

We reviewed the published Article 15 procedures of six controllers active in the UK and EU markets, across three size tiers. Each illustrates a distinct failure mode — not because the controllers are uniquely at fault, but because the pattern each represents is widespread in its tier.

1. Procedural rigorist — Equifax UK

Equifax UK operates the most transparently documented SAR pipeline of the six. The privacy policy names a dedicated Data Protection Officer (UKDPO@equifax.com), lists multiple intake channels, and specifies response windows ranging from immediate (online, post-verification) to one calendar month (postal). Users can select from categorised data types — marketing services data, motor finance data 2007-2024, director/secretary records, customer service records, written communications, call records.

Verification for postal SARs requires one proof-of-address document dated within the last three months and one form of government ID (passport or driving licence). Phone SARs require a 30-minute verification call. Online SARs require identity proofing before data release. Current and former employees are routed to a dedicated channel (HR.EmployeeDSAR@equifax.com).

This is the UK ICO-compliant gold standard. It is also, in strict AP terms, disproportionate: asking for passport or driving-licence copies exceeds the "never required as default" standard the AP applied to DPG Media. Equifax operates across both jurisdictions, and the UK approach governs its pipeline. The pattern illustrates that "proportionate" is read differently across DPAs, and cross-border controllers typically adopt the stricter-sounding UK standard — meaning Dutch data subjects face more intrusive verification than AP guidance requires.

2. Diversion-first — Experian UK

Experian's Data Access Request portal exists and is well-documented. It is also surrounded by alternative routes that divert the requester before they reach the full DSAR. The policy actively suggests: the free Experian account for credit score, the paid CreditExpert subscription for credit report, the statutory report service, marketing preferences via a separate portal. Current employees are routed to Oracle; former employees to AccessHR@experian.com.

The full DSAR is available — but positioned as the last option, behind five lighter alternatives. Identity verification is mentioned only as "pass our security checks" with no public specification. Separately, Experian's own UK operations were the subject of the 2024 Upper Tribunal decision analysed in our UK GDPR data broker rights guide — which established that the ICO's systemic enforcement case against Experian's marketing datasets failed on notice and legal-basis grounds, though individual Art 15 and Art 21 rights were unaffected.

This is not deceptive. It is optimisation. Each diverted requester is one less manual file to handle and one less identity verification to execute. The pattern is legitimate but illustrates a structural bias: the controller's interest is in minimising full DSARs, not in facilitating them.

3. Compliant minimalist — LexisNexis Risk Solutions

LexisNexis' privacy rights section is a single paragraph. It enumerates the five rights (access, rectification, erasure, restriction, portability), states that requests should be submitted via the privacy centre, and notes that "we may require you to verify your identity." Nothing more.

This is defensible under Art 12 — the rights are disclosed, the channel exists. But the verification standard is entirely internal and entirely discretionary. Nothing in the public policy tells a data subject what to expect when they submit. Nothing in the public policy tells an auditor what standard LexisNexis applies. The controller retains maximum operational discretion at the cost of external accountability.

4. Data maximalist — Dun & Bradstreet (covers Bisnode)

Dun & Bradstreet's Data Subject Rights request pipeline is outsourced to TrustArc. The submission form asks for full name, validated email, phone number, prior companies associated with, position and address at each prior company, DUNS numbers where known, Eyeota cookie IDs, and any additional comments. The form explicitly acknowledges that "Dun & Bradstreet may not be able to match my name to a specific consumer or household without this additional information."

Three observations. The requester surrenders more identity data to make the SAR than the SAR may ultimately return. TrustArc sits in the processor chain and retains an encrypted record of preferences. Repeat requests for the same right within three months are declined.

This is verify-by-collecting — compliance through data maximalism rather than data minimisation. Under Art 5(1)(c), GDPR requires data collection to be adequate, relevant, and limited to what is necessary. The form's own acknowledgement that additional information may be needed to match a record suggests the data collected exceeds the minimum necessary for verification. Whether this passes a proportionality test is untested. The structure is characteristic of outsourced-compliance pipelines that lean maximalist to reduce controller liability. D&B's recent regulatory history is also relevant: the Spanish AEPD fined Informa D&B €1.8 million in January 2025 for Art 6 and Art 14 violations in its handling of 1.6 million sole-trader records.

5. Thin procedure — 192.com

192.com's rights section lists the four main rights (consent withdrawal, access, rectification, erasure/restriction/objection) in plain English. The intake is postal (36 Southwark Bridge Rd, London SE1 9EU) or email (feedback@192.com). Response window: 30 days. Fee: none. Verification: "we may require you to give reasonable proof of your identity."

This is the thinnest procedure of the six. There is no form, no portal, no published ID standard, no named DPO channel. The duty analyst on any given day exercises full discretion over what "reasonable proof" means. For low-sensitivity UK people-search data, this may be defensible. Applied to a controller holding financial, health, or employment data, the same procedure would be untenable.

The pattern is common at SME-tier controllers across the EU and UK. The rule applies, the procedure is thin, the operational standard is whatever the analyst decides on the day.

6. Industry-seal gap — CDDN B.V.

CDDN B.V. operates the Dutch Consumer Records Database. Their own published privacy statement opens:

"Privacy and security are serious matters and we therefore take the rules and boundaries very seriously. Privacy is becoming increasingly important, certainly in terms of consumer data."

CDDN B.V. Privacy & Cookie Statement (checked 24 April 2026)

The same policy describes the organisation's technical controls:

"CDDN B.V. takes the protection of your data seriously and takes appropriate measures to prevent abuse, loss, unauthorised access, inappropriate disclosure, and unauthorised alteration."

This positioning is publicly reinforced by the organisation's clients. Martin Huisman of NIMA — the Dutch national marketing association, which describes itself as the Netherlands' largest marketing network — writes in a testimonial published on cddn.nl:

"At an organisation where data plays the main role, all eyes are focused on integrity, privacy and ethics. CDDN understands this like no one else. By being the first to obtain the NIMA Marketing Integrity Quality Mark for companies, CDDN in my opinion proves that it has an exemplary role in the market for data services."

Huisman confirms the NIMA seal "is based on the current legislation and regulations" and that CDDN was the first organisation to receive it. NIMA itself is a CDDN client — CDDN maintains NIMA's member database.

UNICEF Nederland is also a published CDDN client, with a real-time API integration between CDDN and UNICEF's Microsoft Dynamics 365 donor database. UNICEF's testimonial on cddn.nl describes the relationship:

"What makes this collaboration truly valuable is the combination of technical quality, GDPR knowledge, and short communication lines. CDDN thinks along, switches quickly, and understands exactly the care needed within an organisation like UNICEF."

Three provisions of the same CDDN privacy policy, quoted verbatim, read against current European regulatory standards.

First — identity documents as the default SAR verification method. The policy states:

"To be certain that the request for inspection was submitted by you, we ask you to include a copy of your identity document with the request. Please blacken out your passport photo, MRZ (machine readable zone, the bar with numbers at the bottom of your passport), your passport number and civil service number (BSN)."

The AP's published guidance places a redacted ID copy at the bottom of the verification hierarchy, not the top:

"You must always try as much as possible to establish someone's identity using the data you already have from this person. […] You are not allowed to ask for more than a copy ID in which certain data have been blocked. But this is only permitted if there really is no other way."

AP, Privacy rights in practice

The AP requires a controller to first attempt less-intrusive verification using data the controller already holds — email confirmation of a telephone request, text-message confirmation of an email request, matching a known phone number, or asking for specific identifiers such as account-number digits plus date of birth. The redacted ID copy is the fallback, permitted "only if there really is no other way."

CDDN's own retention table documents the personal data it holds on data subjects: first name and surname, date of birth, address details, telephone number, email address, IP address, and bank account number. Every category of data the AP lists as a less-intrusive verification alternative is already in the controller's records. The policy nevertheless requests a redacted identity-document copy as the default — the fallback method, asked for before the less-intrusive methods the AP requires to be tried first.

The operational effect of that choice — intended or not — is deterrence. A redacted ID copy requires the requester to locate their passport, scan or photograph it, apply manual redactions to specific fields (passport photo, MRZ, passport number, BSN), and send the file. Each step is friction. The less-intrusive methods the AP lists — email confirmation, text confirmation, matching an account number digit plus date of birth — are near-zero friction. Choosing the highest-friction permitted option as the default does not verify identity any better than the alternatives the controller already has the data to execute. It does reduce the number of SARs the controller will ever have to process.

Second — the legal basis for transatlantic transfers. The policy states:

"Google states to comply with the Privacy Shield principles and is affiliated with the Privacy Shield programme of the American Department of Commerce. This implies that there is question of an appropriate level of protection for the processing of potential personal data."

And, separately:

"LinkedIn, Twitter, Facebook, and Google+ state to comply with the Privacy Shield principles and are affiliated with the Privacy Shield programme…"

The CJEU judgment in Schrems II (C-311/18), delivered 16 July 2020, invalidated Privacy Shield. The EU-US Data Privacy Framework, which replaced it, entered into force on 10 July 2023.

Third — retention. The policy's own retention table states, for the following categories:

First name and surname — indefinite period of time
Gender — indefinite period of time
Address details — indefinite period of time
Telephone number — indefinite period of time
Email address — indefinite period of time
IP address — indefinite period of time

GDPR Art 5(1)(e) requires that personal data be:

"kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the personal data are processed."

These three provisions — read against the organisation's own published self-description, the industry-body certification it holds, and the charitable client that publicly credits its GDPR competence — are the article's central observation in a single vignette.

History teaches: three times rule-on-paper became rule-in-practice

This is not the first regulatory rule that was technically complied with long before it was operationally complied with. Three historical cycles illustrate the pattern and the typical inflection point that closes the gap.

2002-2006: Sarbanes-Oxley and the first CEO/CFO prosecutions

The Sarbanes-Oxley Act was signed in July 2002. For the first three years, most Fortune 500 companies treated it as a documentation exercise: Section 404 internal-control reports, consultant-led readiness assessments, GRC tooling procurement. The Section 302 and 906 personal-certification requirements were on the books but untested.

Between 2005 and 2007, three prosecutions changed the conversation. Bernard Ebbers (WorldCom CEO) was convicted in March 2005 and sentenced to 25 years for his role in an $11 billion accounting fraud. Dennis Kozlowski (Tyco CEO) was convicted in June 2005 and sentenced to 8⅓ to 25 years. Jeffrey Skilling (Enron CEO) was convicted in May 2006 on 19 counts and reported to federal prison in December 2006. All three cases were prosecuted under authority that SOX had activated or amplified, and the fact pattern — personal liability for executive-level reporting failures — was visibly SOX-shaped.

After 2006, CFOs began personally reviewing ICFR. Audit committees began functioning as governance bodies rather than rubber stamps. The three-to-four year gap between the law and the first prosecution wave is approximately the gap we are currently in on NIS2 Art 20 personal liability.

2004-2009: PCI DSS and Heartland

The Payment Card Industry Data Security Standard launched in 2004. Most Level 1 merchants treated it as an annual Qualified Security Assessor review — Report on Compliance submitted, boxes ticked, certificate issued. The standard's technical requirements were partially implemented; the audit process was fully implemented.

In January 2009, Heartland Payment Systems disclosed a breach affecting 134 million card records. SQL injection into a public-facing application had given attackers access to install spyware that exfiltrated data over several months in 2008. Heartland had been certified PCI DSS compliant at the time of the breach. Compliance was lost after disclosure and not revalidated until May 2009.

The lesson documented across the industry after Heartland: being certified compliant does not mean being secure. Retailers accelerated tokenisation, point-to-point encryption, POS segmentation, and card-scheme-backed penalty structures. PCI DSS 2.0 (2010) and 3.0 (2013) tightened the attestation process in response.

The parallel to CDDN's NIMA certification is direct. Holding an industry-body certification at the time a regulator's standard is contradicted does not insulate the controller from the enforcement that follows. It may in fact accelerate it, because the seal raises the reasonable-expectation standard.

2018-2023: GDPR's own cycle, and why Art 15 is next

GDPR became enforceable in May 2018. The first 18 months saw regulators focus on the visible, procedural provisions: privacy notices (Art 13/14), records of processing (Art 30), consent banners (Art 7). Most controllers rebuilt these layers first because they were the most auditable.

The substantive enforcement wave followed. CNIL fined Google €50 million in January 2019 for lack of transparency and invalid consent around ads personalisation. Hamburg's data protection authority fined H&M €35.3 million in October 2020 — not for security failures but for collecting and processing employee private-life data (illness, family, religion) without valid legal basis at its Nuremberg service centre. Luxembourg's CNPD fined Amazon €746 million in July 2021 for targeted advertising without valid legal basis (the fine has since been annulled on procedural grounds, though the Administrative Court largely sided with the CNPD on the underlying GDPR violations). The Irish DPC fined Meta €1.2 billion in May 2023 for continuing EU-US data transfers in breach of Chapter V.

The pattern: regulators shifted from "does the notice exist" to "is the legal basis valid, is the processing proportionate, is the transfer lawful." Three to five years from law to substantive-compliance enforcement wave.

The third wave — Art 15, 17, 21 (access, erasure, objection) — is visible on the horizon but has not yet produced mega-fines. It will. The enforcement mechanics differ: access and erasure complaints are individual, procedural, and high-volume. They do not require DPAs to investigate systemic design failures. They require the DPA to confirm a single instance of non-compliance, and the controller loses.

This is the wave Art 15 is entering now. The attacker-channel dimension is the risk multiplier that will push it from procedural enforcement into NIS2 board-liability territory.

The NIS2 inflection point

NIS2's deadline for national transposition passed on 17 October 2024; the directive itself entered into force in January 2023. Article 20 creates personal accountability for management-body members for the approval and oversight of organisational cybersecurity measures. Article 21(2) sets out the technical and organisational measures management bodies must ensure are in place, including (d) supply-chain security, (g) human-resources security and access control, and (h) basic cyber hygiene practices. The NIS2 risk vector is analysed in detail in our supply-chain exposure article.

The accountability window does not open the day NIS2 is in effect. It opens when the first controller is prosecuted under national implementation, and the prosecution is traceable to a specific management-body failure. Historical parallels suggest that window opens 18 to 36 months after transposition — so 2026 to late 2027 for the earliest NIS2 liability cases. The transition period will not be clean. Early enforcement will be inconsistent. Appeals will clarify the standard, as happened with the Amazon €746 million annulment.

Article 15 SAR verification sits squarely within Art 21(2)(h) (basic cyber hygiene) and arguably (g) (human-resources security, for employee-targeting SAR-based impersonation). A controller whose Article 15 pipeline leaks personal data to an attacker, and whose management body cannot demonstrate documented oversight of the verification standard, is exposed under both GDPR and NIS2. The penalty stacking is material: GDPR Art 15 breach fines up to 4% global turnover, plus NIS2 management-body personal fines, plus potential civil liability to the wrongfully-disclosed data subject.

Within three years, most EU and UK mid-to-large controllers will operate an Art 15 pipeline that is materially tighter than today's baseline. The sector that tightens first gains a defensible compliance posture; the sector that tightens last provides the case law that tightens everyone else.

For boards and compliance functions: what to do before the wave

The operational prerequisites are not complex. The prerequisites for documented board oversight are harder.

Verification standard as a policy decision. The management body should set, and document having set, the Art 15 verification standard the organisation operates. Email verification supplemented by out-of-band confirmation is the current European baseline. Identity-document requests should be the documented exception, not the rule, with a written justification of proportionality where used.

SAR intake as a SIEM-level signal. Anomalous SAR patterns — clusters from look-alike domains, high-frequency requests following a public identity-pack leak, SARs from jurisdictions where the controller has no data subjects — should trigger investigative review, not automatic processing. The same logic that applies to credential-stuffing detection applies to SAR-based reconnaissance.
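As a sketch of what that triage could look like in code — the domain list, thresholds, and field names are assumptions for illustration, not a product recommendation — standard-library string similarity is enough to surface look-alike-domain and volume anomalies:

```python
from collections import Counter
from difflib import SequenceMatcher

KNOWN_DOMAINS = {"examplebank.com"}  # hypothetical corporate domain(s)

def looks_alike(domain: str, threshold: float = 0.85) -> bool:
    """Flag domains near, but not equal to, a known domain
    (e.g. 'examp1ebank.com') -- a common forged-SAR pattern."""
    return any(
        domain != known
        and SequenceMatcher(None, domain, known).ratio() >= threshold
        for known in KNOWN_DOMAINS
    )

def triage_sar_batch(requests, volume_threshold: int = 5):
    """requests: iterable of (requester_domain, subject_id) pairs.
    Returns (subject_id, reason) pairs queued for investigative review."""
    requests = list(requests)
    volume = Counter(domain for domain, _ in requests)
    flags = []
    for domain, subject in requests:
        if looks_alike(domain):
            flags.append((subject, f"look-alike domain: {domain}"))
        elif volume[domain] > volume_threshold:
            flags.append((subject, f"high-volume domain: {domain}"))
    return flags
```

Flagged requests are routed to a human analyst rather than auto-processed — the same triage-before-fulfilment logic a SOC applies to credential-stuffing alerts.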

Staff training as NIS2 compliance evidence. Analysts handling Art 15 requests should receive documented training on the forged-SAR pattern, the cross-reference attack, the current EDPB verification standard, and the specific national DPA guidance (for example, the AP's DPG Media standard for NL-facing operations). Training records populate the audit trail NIS2 Art 20 accountability requires.

Periodic SAR-handling audit. An external review of the organisation's Art 15 pipeline — sampled response accuracy, verification consistency, documentation completeness — produces the board-level evidence that management oversight was exercised. This is what a Corporate Audit would typically surface. For individual executives assessing their personal Art 15 exposure, The Eraser includes structured SAR work as part of the removal process.

Part 2 is coming

We publish the second half of this analysis in approximately eight weeks: a field study in which we submit Art 15 requests to 20 controllers across sectors and jurisdictions, tracking 30-day compliance, verification rigour, data completeness, and format quality. Methodology will follow Pavur's ethical frame: submissions made against the analyst's own identity or explicitly test identifiers, no third-party impersonation, all responses anonymised before publication.

If you operate an Art 15 pipeline and want to understand how your organisation compares to the archetype patterns described above, a Corporate Audit includes a structured review of the pipeline against the current European verification baseline.

Frequently Asked Questions

Is the subject access request really used as an attack channel?

Yes. James Pavur's 2019 research at Black Hat USA documented 150 SAR submissions using a forged email address; 24% of the responding companies returned sensitive personal data including passwords, home addresses, and payment card digits. More recent threat-actor behaviour builds on this by combining forged SARs with stealer-log-derived verification material to pass identity-proofing questions at subsequent controllers. The attack surface is sector-wide and has not been materially addressed since Pavur's disclosure.

What verification should controllers require for an Article 15 request?

The EDPB Guidelines 01/2022 state that requesting a copy of an identity document is generally inappropriate unless strictly necessary. The Dutch AP applies a three-tier hierarchy: a full ID copy is never permitted except where the law specifically requires it; a redacted ID copy is permitted only if no less-intrusive method is available; and controllers must first attempt less-intrusive methods using data they already hold — login systems, multi-factor confirmation via email or text, or specific identifiers such as account-number digits plus date of birth. Document-based verification is the fallback, not the default, and verification must be proportionate to the sensitivity of the data held and the potential damage a wrongful disclosure would cause.

Does NIS2 apply to subject access request handling?

Article 21(2)(h) requires basic cyber hygiene practices, and Article 21(2)(g) covers human-resources security and access control. A controller whose Article 15 pipeline leaks personal data to an attacker, where the management body cannot demonstrate documented oversight of the verification standard, is exposed under both GDPR and NIS2. NIS2 Article 20 makes management-body members personally accountable for oversight failures. The first prosecutions are expected 2026 through late 2027.

What is the difference between UK GDPR Article 15 and EU GDPR Article 15?

The provisions are substantively identical. The UK DPA 2018 adds a specific default behaviour for credit reference agencies: requests to CRAs default to a Limited Subject Access Request (LSAR) covering financial circumstances only, unless the requester specifies otherwise. Post-DUAA 2025, UK controllers also benefit from clarifications on when requests are "manifestly unfounded or excessive" and on the scope of internal-complaint handling before ICO escalation. The UK-specific analysis is in our UK GDPR data broker guide.

What happens when a SAR wrongly discloses personal data to the wrong person?

Three consequences. The wrongly-disclosed data subject can file a complaint with the supervisory authority (ICO in UK, AP in NL, etc.) and the controller faces an Art 32 (security) and Art 15 (access) compliance investigation. Under GDPR, fines can reach 4% of global turnover. Under NIS2, management-body members can be personally fined. The data subject can also pursue civil claims under Art 82 for material and non-material damage — the floor set by CJEU jurisprudence is low but non-zero.

How is this changing under DUAA 2025 in the UK?

The UK Data (Use and Access) Act 2025 took effect on 5 February 2026. For subject access requests specifically, DUAA clarifies the "reasonable and proportionate search" standard controllers must apply and introduces a new internal-complaints regime that becomes mandatory from 19 June 2026 — requiring controllers to handle complaints internally before the ICO will accept escalation. The effect is ambiguous: the reasonable-search clarification reduces controller burden, while the complaints regime adds procedural friction for data subjects. Both changes are still settling into operational practice.

Sources

This article is built from primary sources: regulator publications and guidance, published tribunal judgments, EDPB guidelines, legislation, and the privacy policies of the controllers analysed. Where analytical commentary is cited, it is professional legal or security analysis from named practitioners.

Spain — AEPD:

  • AEPD decision on Informa D&B (January 2025), €1.8M fine — ppc.land summary.


Related Service

Corporate Exposure Audit — from €5,000

Consented executive exposure audits, corporate leak surface mapping, third-party vendor review, and quarterly security posture reporting.

