Clearview AI — The Facial Database That Never Forgets
By Adeline Atlas
Jun 20, 2025
Clearview AI has built what is arguably the most powerful—and most controversial—facial recognition database in history. With over 30 billion images scraped from the public internet, this private company holds the biometric signatures of a staggering portion of the global population. And most people have never heard of it. You didn’t sign up for it. You didn’t consent to it. But if you’ve ever uploaded a photo to Facebook, LinkedIn, Instagram, or Twitter, Clearview likely has your face on file.
Today, we’re taking you inside the rise of a company that turned our digital lives into a biometric panopticon. This isn’t theoretical. Law enforcement, immigration agencies, and private corporations are using this tool right now to scan, match, and monitor human beings—often without their knowledge or legal recourse.
Let’s begin with the basics. Clearview AI was founded in 2017 by Hoan Ton-That, a software engineer and entrepreneur. His platform uses web scraping—automated bots that comb public websites for data—to harvest billions of publicly available photos, including images from social media, news articles, personal blogs, and video stills. Each image is run through a facial recognition algorithm that converts the face into a mathematical representation—a faceprint.
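To make the faceprint idea concrete, here is a minimal sketch using the open-source face_recognition library, a dlib-based Python package. Clearview’s actual model is proprietary, so treat this as an analogue rather than its implementation; the filename is hypothetical.

```python
# Illustrative sketch only: Clearview's model is proprietary.
# The open-source face_recognition library shows the general idea of
# turning a photo into a "faceprint": a fixed-length numeric vector.
import face_recognition

def compute_faceprints(image_path: str):
    """Return one 128-dimensional embedding per face found in the image."""
    image = face_recognition.load_image_file(image_path)  # RGB numpy array
    return face_recognition.face_encodings(image)         # list of 128-d vectors

# Photos of the same person land close together in this vector space;
# photos of different people land far apart. That geometry is what
# makes a scraped photo searchable later.
faceprints = compute_faceprints("scraped_photo.jpg")  # hypothetical file
```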
That faceprint is then added to a massive internal database. When a Clearview client uploads a new image—say, a security camera screenshot—the system runs it against the database, producing likely matches, links to online profiles, and other associated metadata. In seconds, a person’s identity, social connections, and online presence can be reverse-engineered from a single snapshot.
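Mechanically, “running it against the database” is a nearest-neighbor search over those vectors. The sketch below compares embeddings by Euclidean distance; the database layout, threshold, and names are assumptions for illustration, since Clearview’s internals are not public.

```python
import numpy as np

def match_faceprint(query, database, threshold=0.6):
    """Rank stored faceprints by Euclidean distance to the query vector.

    `database` is a hypothetical list of (embedding, source_url) pairs.
    For dlib-style 128-d embeddings, same-person pairs usually fall
    under ~0.6 distance; the real cutoff is a tuning choice.
    """
    matches = []
    for embedding, source_url in database:
        dist = float(np.linalg.norm(np.asarray(query) - np.asarray(embedding)))
        if dist < threshold:
            matches.append((dist, source_url))
    return sorted(matches)  # smallest distance (best match) first
```

A linear scan like this could never search 30 billion vectors in seconds; systems at that scale typically rely on approximate nearest-neighbor indexes such as FAISS, which trade a sliver of accuracy for millisecond lookups.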
Unlike traditional law enforcement databases, which are limited to mugshots or government-issued IDs, Clearview AI’s database includes faces of regular people—students, doctors, journalists, activists—anyone whose photo exists online. This shifts facial recognition from a tool of criminal investigation to a tool of total identification.
And Clearview is already in widespread use. Over 3,000 law enforcement agencies in the U.S. have had access to its platform, including the FBI, Department of Homeland Security, and local police departments. Immigration and Customs Enforcement (ICE) has used it for deportation investigations. Private companies—like retail stores and stadium security contractors—have tested Clearview to prevent shoplifting or screen for blacklisted individuals.
But here's where it gets more concerning: you can’t opt out. Because Clearview pulls from public platforms, it circumvents the need for consent. If your face is online, it’s fair game. Even if you delete your accounts, Clearview retains the images it already captured. There is no “erase me” button. And the company has resisted calls to make its database searchable to the public—meaning only authorized users can see who’s being tracked.
Let’s walk through a real example. In 2019, police in Florida used Clearview to identify a suspect caught on blurry surveillance footage. They uploaded the grainy image, and Clearview returned a list of social media profiles with matching facial geometry. One match led to an Instagram account that showed the suspect wearing the same shoes as in the footage. That connection led to an arrest.
Now ask yourself: If you were caught on camera in a crowd, could your identity be pulled the same way? With Clearview, the answer is yes.
In 2020, a New York Times investigation exposed the scope of Clearview’s operations. Journalists tested the system with their own faces and were stunned to see how many photos came back—including ones they didn’t remember posting. One reporter’s matches included images from Flickr, YouTube thumbnails, and obscure blog articles. The facial algorithm had identified them across platforms, time zones, and even stages of aging.
Clearview claims it only sells access to law enforcement and government entities. But leaked documents show trials and demo accounts offered to large retailers, private security firms, and even hedge funds. And while the company insists it doesn’t track people in real time, some clients have reportedly used it to scan live security feeds, essentially enabling passive, continuous identification in public spaces.
Now let’s examine the technical power of Clearview’s engine. According to the company, its system can identify faces from partially obstructed images, blurred video stills, and photos taken at poor angles. The matching isn’t based on your image alone—it’s based on how your face moves through the internet. One photo at a wedding, one at a protest, one in a college yearbook—all connect to form a profile. Each angle strengthens the algorithm’s confidence. This is not just face ID—it’s lifelong traceability.
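One standard way to turn many sightings into a single, stronger profile is embedding aggregation: averaging a person’s faceprints into one centroid, so each new angle or age tightens the match. This is a common technique in face recognition research; whether Clearview does exactly this is not public, so the sketch below is an assumption.

```python
import numpy as np

def aggregate_profile(embeddings):
    """Fuse several faceprints of (presumably) the same person.

    Averaging embeddings taken at different angles, ages, and lighting
    yields a centroid that matches new photos of that person more
    reliably than any single photo would.
    """
    centroid = np.mean(np.stack(embeddings), axis=0)
    return centroid / np.linalg.norm(centroid)  # re-normalize to unit length
```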
What about regulation? That depends on where you live. In the U.S., there’s no federal biometric privacy law. A handful of states have stepped in: Illinois’s Biometric Information Privacy Act (BIPA) requires informed consent before a company can collect face scans, and California’s CCPA treats biometric data as protected personal information. Clearview has faced lawsuits under these laws, and in a 2022 settlement of an Illinois BIPA suit it agreed to stop selling its database to most private companies in the U.S. But enforcement is slow, and the legal terrain is uneven.
In Canada, the privacy commissioner ruled in 2021 that Clearview’s practices violated federal privacy laws and ordered the company to stop collecting images of Canadians. Clearview responded by geofencing Canadian IP addresses—but this does not stop cross-border use by American agencies scanning Canadian faces.
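A toy sketch makes the weakness obvious: geofencing gates where a query comes from, not whose biometrics are stored. The country check below is hypothetical, but the logic is the point.

```python
# Hypothetical geofence: it filters the client's location, while
# Canadians' faceprints remain in the database for everyone else.
BLOCKED_COUNTRIES = {"CA"}

def allow_search(client_country: str) -> bool:
    """Gate access by the querying client's country, not the subject's."""
    return client_country not in BLOCKED_COUNTRIES

print(allow_search("CA"))  # False: a Canadian client is turned away
print(allow_search("US"))  # True: a U.S. agency can still search Canadian faces
```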
In Europe, Clearview was hit with multi-million dollar fines under the GDPR. France, Italy, and the UK have all ruled that Clearview’s scraping violates data protection principles, especially around consent and purpose limitation. But again, enforcement is complex. The UK fine was overturned on jurisdictional grounds in 2023, the company doesn’t operate servers in Europe, and regulatory bodies struggle to enforce cross-border bans.
Now let’s look at potential abuses of this technology.
- Political Targeting: Activists and journalists fear that Clearview could be used to identify and suppress dissent. If police scan a protest crowd with body cams, they could create instant lists of attendees.
- Corporate Blacklisting: Imagine a store using Clearview to scan faces at the door, flagging known shoplifters—but also flagging whistleblowers or ex-employees. There’s little oversight on how private users deploy the tech.
- Stalking and Harassment: If access to Clearview were ever leaked or hacked, abusers could use it to track victims. One photo from across a bar could reveal a person’s name, address, workplace, and family connections.
- Racial Bias and Misidentification: Like many facial recognition systems, Clearview has shown higher error rates for non-white faces. Several wrongful arrests in the U.S. have been linked to misidentification via facial recognition software. And as the arithmetic sketch after this list shows, error is baked into any search at this scale.
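To see why misidentification is baked in, run some back-of-the-envelope arithmetic with deliberately illustrative numbers:

```python
# Base-rate sketch (illustrative numbers, not Clearview's published stats).
database_size = 30_000_000_000   # claimed faceprints in the database
false_match_rate = 0.001         # hypothetical 0.1% error per comparison

expected_false_candidates = database_size * false_match_rate
print(f"{expected_false_candidates:,.0f} faces could wrongly clear the threshold")
# -> 30,000,000: even a "99.9% accurate" matcher, applied to 30 billion
# faceprints, can surface millions of innocent look-alikes per probe.
```

Real systems use far stricter thresholds and return only a ranked shortlist, but the base-rate problem never disappears, and it lands hardest on the demographic groups the training data covers least.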
Clearview says it’s solving crimes. But we must ask: At what cost?
Facial recognition is not neutral. It is a weapon of identification, often wielded without checks, transparency, or accountability. When your face becomes a searchable query, you are no longer anonymous in public. You are a walking data point. A trackable node in a surveillance web.
And the real question isn’t whether Clearview is legal or helpful. The real question is: Should it exist at all?
Because once a private company amasses the faces of an entire population, power shifts. The ability to move through life unseen, untagged, or unscanned disappears. We lose not just privacy—we lose freedom of assembly, freedom of thought, and the basic safety of moving through a crowd without being cataloged.
Police need a warrant to tap your phone. Why can they scan your face without one?
And if you never gave permission to be in a biometric database, why is your face already there?
The age of anonymous citizenship is ending. And Clearview AI is accelerating that end on an industrial scale. It’s no longer a matter of if your face is in the system. It’s a matter of how often it’s being used—and who’s watching on the other side of the screen.
Your face was never meant to be your barcode. But now it is.