METHOD

The Mosaic Effect: How Harmless Data Combines Into a Complete Profile

The mosaic effect is a concept from intelligence analysis. Individually, a piece of information may be harmless — your employer, your neighbourhood, your gym, your children's school. Combined with other harmless pieces, each one builds on the last to produce something qualitatively different: a complete profile of a person, their movements, their relationships, and their vulnerabilities.

Governments have applied this concept for decades to justify classifying information that would not, in isolation, warrant classification. The logic: the combination is sensitive, even when none of the parts are. The same principle shapes the investigative techniques covered in our Executive Digital Privacy hub.

It applies just as well to private individuals. Almost no one has applied it systematically.
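The arithmetic behind the effect is simple: each harmless attribute removes most of a candidate pool, and a handful of them, multiplied together, can shrink a city to a single person. A minimal sketch, using a synthetic population and invented selectivities (every number below is an assumption, not a measurement):

```python
# Illustrative sketch: how independent, individually harmless attributes
# multiply together to shrink an anonymity set. All numbers are invented.
POPULATION = 1_000_000  # assumed metro-area population

# Fraction of the population matching each attribute (assumed values)
attributes = [
    ("works in tech", 0.08),
    ("lives in one postcode district", 0.01),
    ("member of a specific gym", 0.02),
    ("child at a specific school", 0.004),
]

candidates = POPULATION
for name, fraction in attributes:
    candidates = max(1, round(candidates * fraction))
    print(f"+ {name:<32} ~{candidates:,} candidates remain")
```

The multiplication assumes the attributes are independent, which real attributes rarely are; correlated attributes shrink the pool more slowly, but four or five of them still routinely reduce a metropolitan area to single digits.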

The term entered legal discourse in United States v. Jones, 565 U.S. 400 (2012), in which the US Supreme Court considered whether attaching a GPS tracker to a suspect's vehicle constituted a search. The Court ruled that it did, but Justice Sotomayor's concurrence went further. She noted that aggregated location data — each individual ping innocuous — produces a comprehensive record of a person's associations, movements, religious practices, and private life. The concern was not any single data point. The concern was what the collection, over time, reveals.

The decision did not resolve the question fully. But it named the problem precisely: aggregation changes the nature of data.

The institutional proof

In January 2018, security analyst Nathan Ruser noticed something in Strava's publicly available global heat map. The fitness-tracking app had published every run, cycle, and walk logged by its users, rendered as light on a dark map. In populated cities, the map looked as expected. In remote areas, bright patterns appeared where no settlement should be.

Those patterns traced the perimeter of classified military installations in Syria, Somalia, and Afghanistan.

Each data point was a personal fitness record. A soldier's morning run. Combined globally, they revealed the outlines of facilities that did not appear on any public map. No hack. No breach. Default public settings on a fitness app.

Strava has since changed its defaults. The lesson has not changed.

The individual proof

In July 2021, Monsignor Jeffrey Burrill, the general secretary of the United States Conference of Catholic Bishops — effectively the senior administrative official of the Catholic Church in the US — resigned after a Catholic news outlet published an investigation into his private life.

The outlet had obtained commercially available location data from a data broker. The data originated from apps installed on Burrill's phone. It was sold as anonymised. The publication cross-referenced location patterns — his workplace, his residence, gay bars he visited while travelling for work — against publicly known addresses and his schedule. No single data point was sensitive. The combination identified a specific person's private behaviour and ended his career.

No hack. No breach. No illegal access. Commercially purchased data, competently aggregated.

This is the mosaic effect operating at the individual level, with a documented outcome and a named subject.

The platforms you are using now

Venmo, the payment app, is public by default. In May 2021, BuzzFeed News found President Joe Biden's Venmo account in under ten minutes using the app's built-in search tool and public friends feature. Biden's individual transactions were not visible. His friend list was. From it, reporters mapped his social network: family members, senior White House officials, and their contacts. One account, public by design, produced a functional social graph of a sitting US president.

Biden is a public figure. The methodology works on anyone with a public Venmo account.

Google Maps records your location history on Android and on iOS once Location History is enabled and location permissions are granted. From 2016, US law enforcement used geofence warrants: court orders requiring Google to produce data on every device present in a defined area during a defined time window. The New York Times reported in 2019 on Jorge Molina, a 24-year-old Arizona man arrested for murder because Google data placed his phone near the crime scene. He was innocent, and was released after nearly a week. The combination of his location and the event produced a picture that required an investigation to disprove.

By the time Google announced a change to its architecture in December 2023, moving Location History from its servers to on-device storage, it had received tens of thousands of geofence warrant requests. The rollout completed in late 2024. The practical consequence: the company can no longer respond to geofence warrants. The implication: the data had been there, accessible, for years.

Your location history, if enabled, is a continuous record of everywhere your phone has been. Each entry is a timestamp and a coordinate. Combined, they record your home, your workplace, your medical appointments, your place of worship, your relationships, and your routines.
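Extracting those facts from the raw record requires almost no analysis. A sketch with synthetic pings, one coordinate pair per hour for a week, office by day and home by night: the coordinates and schedule are invented placeholders, and the inference step is the point.

```python
# Sketch: inferring "home" from raw (timestamp, coordinate) pings.
# All data here is synthetic; the coordinates are invented placeholders.
from collections import Counter
from datetime import datetime, timedelta

HOME = (51.5210, -0.0710)    # hypothetical flat
OFFICE = (51.5140, -0.0860)  # hypothetical workplace

pings = []
t = datetime(2024, 3, 4, 0, 0)  # a Monday, midnight
for _ in range(24 * 7):  # one week of hourly pings
    at_work = t.weekday() < 5 and 9 <= t.hour < 18
    pings.append((t, OFFICE if at_work else HOME))
    t += timedelta(hours=1)

# Where is the phone between midnight and 6 am? Almost always home.
night_points = [point for ts, point in pings if ts.hour < 6]
inferred_home, _ = Counter(night_points).most_common(1)[0]
print(inferred_home == HOME)  # prints True
```

The same frequency count run over 9-to-5 hours yields the workplace; recurring weekly visits to a single address yield the gym, the school, the clinic. No machine learning is involved, only counting.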

The personal version

Your name is public. Your employer is on LinkedIn. Your location is listed as a metro region — "Greater London Area," "San Francisco Bay Area," whatever your LinkedIn profile shows — which, combined with your employer and commute patterns, narrows your home address considerably. Your gym appears on Strava if your profile is not private. Your children's school is inferrable from a comment you left in a local Facebook group three years ago, now indexed by Google. Your car appears in the background of a photo you posted in 2022. Your political views are visible from the accounts you follow. Your holiday schedule is reconstructable from five years of Instagram stories.

None of these data points is sensitive. You would share any one of them without hesitation.

The chain connecting them does not require a sophisticated investigator. Your LinkedIn profile establishes your employer and job title. Your employer's website, combined with your name, returns a conference presentation or press mention that reveals your work email format. That email format, run against breach databases, surfaces a credential from a data leak — including your personal email address. That personal address, queried against data broker platforms, returns your home address, phone number, and address history. Your phone number, run through a reverse lookup, confirms the address and adds the names of relatives registered at the same location. None of these steps required anything other than public sources and commercially available data. Together, they establish your identity, location, professional history, and family structure in under an hour.
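The email-format step in that chain is mechanical enough to sketch. Corporate addresses almost always follow one of a few patterns, so generating the candidates from a name and an employer's domain takes a few lines. The name and domain below are placeholders, not real addresses.

```python
# Sketch: enumerating the address formats corporate mail servers commonly
# use. Name and domain are invented placeholders.
def candidate_emails(first: str, last: str, domain: str) -> list[str]:
    f, s = first.lower(), last.lower()
    local_parts = [
        f"{f}.{s}",    # jane.doe
        f"{f}{s}",     # janedoe
        f"{f[0]}{s}",  # jdoe
        f"{f[0]}.{s}", # j.doe
        f"{f}_{s}",    # jane_doe
        f"{s}.{f}",    # doe.jane
        f,             # jane
    ]
    return [f"{p}@{domain}" for p in local_parts]

print(candidate_emails("Jane", "Doe", "example.com"))
```

A single press mention that exposes one colleague's address reveals which pattern the company uses; from then on, any employee's work email is derivable from the staff page.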

An investigator does not need a breach to build this. They need an afternoon and the right tools.

Why self-googling does not show you this

When you search your own name, you see what Google's algorithm surfaces for that query, ranked by relevance and authority. You do not see the aggregation of what a motivated person assembles across twelve sources — combining results, cross-referencing breach databases, mapping patterns over time.

The gap between those two things is the gap between your perceived exposure and your actual one. It is consistently larger than people expect.

The cost collapse

Assembling this picture once required institutional resources: a government, a corporation, a well-funded investigative team. That cost has collapsed.

Data broker aggregators — companies such as Acxiom, LexisNexis Risk Solutions, and Spokeo — have pre-assembled records on hundreds of millions of individuals, combining public records, purchase history, address history, and inferred demographics into structured profiles sold for cents per query. Breach databases, ranging from the publicly accessible Have I Been Pwned to deeper commercial repositories aggregating tens of billions of leaked credentials, have filled in what public sources missed: email addresses, account metadata, partial payment details, and IP address histories from thousands of separate incidents.
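Have I Been Pwned's password-lookup API is itself a small illustration of aggregation handled carefully: the client sends only the first five hex characters of a SHA-1 hash and matches the suffix locally, so the service never learns which password was checked. A sketch of the client-side half (no network request is made here; the query string is a placeholder):

```python
# Sketch of the k-anonymity split used by Have I Been Pwned's password
# range API: only a 5-character hash prefix ever leaves the machine.
import hashlib

def hibp_range_parts(password: str) -> tuple[str, str]:
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

prefix, suffix = hibp_range_parts("correct horse battery staple")
# A real client would fetch https://api.pwnedpasswords.com/range/<prefix>
# and search the returned list for <suffix> locally.
print(prefix, len(suffix))
```

The design is worth noting precisely because it is the exception: most of the sources named above perform the aggregation server-side and sell the assembled result.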

OSINT platforms consolidate these sources further. Tools used professionally in due diligence, background screening, and investigative journalism pull data broker records, social media profiles, corporate registry filings, and breach data into unified subject profiles. What previously required a team of investigators working across separate sources now takes one person and a search query.

The mosaic effect is no longer a concern limited to public figures or high-value targets. The cost of running this process against a private individual is low enough that the threshold has dropped from an institutional decision to a personal one.

GDPR Recital 26 defines personal data broadly: if identifying a natural person is reasonably likely, the data is personal and protections apply. The regulation is designed to capture aggregation.

In practice, the aggregation that creates identifiability happens across sources that each collected their piece lawfully, for separate purposes, with no relationship to each other. The fitness app, the payment platform, the professional network, the data broker, the breach database — none of them combined your data. An analyst did.

The regulation addresses data points at the point of collection. The mosaic is assembled afterwards, by parties the regulation does not directly reach. That gap remains open.

The conclusion

Privacy is not about whether any individual piece of information you share is sensitive. It is about the total pattern your sharing creates over time, across platforms, in contexts you have since forgotten about.

The question to ask is not "is this sensitive?" The question is "what does this add to the picture that already exists?"

Most people have never asked it. Most people have never seen the picture.

Related Service

The Mirror (€595)

A full audit of your digital exposure — breach records, data broker listings, social profiles, dark web presence, and more. Delivered in 48 hours.

Get The Mirror (€595), or get a free Exposure Check.

Share this briefing

If this was useful, sharing it helps others protect themselves. It also helps keep the intelligence briefings free.