The ‘Liveness’ Scan You Never Agreed To
By Adeline Atlas
Jun 21, 2025
Biometric Bondage series: where we learn how anatomy is being linked to authentication in the AI era. I’m Adeline Atlas, 11-time published author, and today we’re looking at a company most people have never heard of—yet it’s quietly scanning millions of faces across borders, banks, and healthcare systems. The company is called iProov, and their flagship product is something called a “liveness scan.” It sounds harmless—until you realize what it actually does, how widely it's spreading, and why you were never really asked for your permission.
Let’s start with the basics. iProov is a UK-based biometric firm founded in 2011. Its technology is now used by governments, banks, telecommunications companies, and healthcare providers in over 20 countries. The core of iProov’s offering is a real-time face authentication system that claims to verify not just that it’s your face on camera—but that it’s really you, in real time, and not a deepfake, photo, or pre-recorded video.
This process is known as “Genuine Presence Assurance.” When you use iProov—whether to sign into a bank, access a medical portal, or verify your age on a website—the system flashes a soft light pattern onto your face, records your skin texture, depth cues, and tiny changes in light absorption, and uses AI to confirm that you're physically present. In just seconds, your device becomes a biometric checkpoint.
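iProov does not publish its internals, but the general shape of this kind of check is a challenge-response protocol: the server picks a one-time light sequence, the camera records how the face reflects it, and the server verifies the reflection matches the challenge it issued, so a replayed video (recorded under an old challenge) fails. The sketch below is a hypothetical simplification under that assumption; the function names, the colour-sequence challenge, and the `simulate_capture` stand-in are all illustrative, not iProov's actual design.

```python
import hashlib
import hmac
import secrets

def issue_challenge(num_flashes: int = 4) -> list[str]:
    """Server side: pick a random, one-time colour sequence to flash on screen."""
    colours = ["red", "green", "blue", "white"]
    return [secrets.choice(colours) for _ in range(num_flashes)]

def simulate_capture(challenge: list[str]) -> list[str]:
    """Stand-in for the camera pipeline: a live face reflects each flashed
    colour as it happens. A pre-recorded video would instead carry the
    reflections of whatever sequence was flashed when it was recorded."""
    return list(challenge)

def verify_liveness(challenge: list[str], observed: list[str]) -> bool:
    """Server side: compare expected vs observed reflections in constant time."""
    expected = hashlib.sha256("|".join(challenge).encode()).digest()
    seen = hashlib.sha256("|".join(observed).encode()).digest()
    return hmac.compare_digest(expected, seen)

# A live session reflects the current challenge and passes.
challenge = issue_challenge()
assert verify_liveness(challenge, simulate_capture(challenge))

# A replay carries reflections of a stale challenge and fails.
assert not verify_liveness(issue_challenge(8), simulate_capture(challenge))
```

The real system additionally analyses skin texture and depth cues with a trained model; the point of the sketch is only the consent-relevant part: the check cannot work without capturing and transmitting imagery of your face in real time.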
Now, let’s be clear. On the surface, this sounds like a smart tool in the fight against identity theft, fraud, and spoofing. And it is. But the deeper problem is how and where iProov’s tech is being used—often without your awareness, and without informed consent.
For example, in the United Kingdom, iProov is used by the National Health Service (NHS) to verify identity for digital patient records. It’s also used by the Home Office for immigration services. In the U.S., iProov signed contracts with the Department of Homeland Security to pilot border control applications. And in Singapore, it’s used for access to SingPass—the country's national digital ID system.
This isn’t opt-in. In many cases, iProov is the only path to verification. Want to access your health data? Want to file for immigration status? Want to prove your age online? Scan your face, or lose access. That’s the model.
And what’s even more alarming is how little transparency exists around how the data is stored, secured, or shared. iProov says it does not retain biometric data—but in many implementations, the system operates in tandem with local databases or private third parties. Once your face is captured and matched, the template may live far beyond the moment.
That brings us to the issue of data permanence. Unlike passwords, your face is not changeable. If a biometric template leaks, is sold, or is integrated into state surveillance systems, there’s no way to claw it back. You’re not just handing over a login—you’re handing over a part of yourself.
And the real concern isn’t just the scan—it’s the scope creep.
iProov has already partnered with companies like Mitek Systems, ID.me, and Jumio, forming a kind of invisible biometric backbone for dozens of apps and services. In many cases, you don’t know you’re using iProov. You just hold your phone up, and the verification happens. The facial scan isn’t branded—it’s embedded. That means users aren’t aware they’re being subjected to high-level biometric assessment. There’s no warning, no breakdown of what’s being captured, and no way to use the service without agreeing.
In fact, iProov was one of the main providers of facial verification for online COVID-19 testing access during the pandemic. Millions of users were asked to scan their face to verify test results, health status, and travel eligibility. This biometric infrastructure was accelerated under the banner of public health—but remains in place post-pandemic.
Let’s talk briefly about the company's language, because this is where things get slippery. iProov doesn’t market its product as surveillance. It uses terms like “frictionless authentication,” “passive verification,” and “user-centric design.” These are PR terms that disguise the reality: you are being scanned, recorded, analyzed, and scored—without physical contact, without full explanation, and without true consent.
There’s also the issue of false positives and algorithmic bias. Like many facial recognition systems, iProov has faced criticism for uneven performance across skin tones, lighting conditions, and facial structures. And when you're building a global biometric net that determines access to banking, healthcare, and government services, even a 1% error rate is unacceptable—at population scale, that means millions of people wrongly locked out of services they are entitled to.
Now add in the possibility of real-time surveillance. Once your face is linked to a system like iProov’s, it can theoretically be used to track you across locations and systems. Combine that with CCTV, heat maps, or movement sensors, and you've created a non-consensual tracking ecosystem—one in which you can be recognized, logged, and profiled, not because you opted in, but because your face is your face.
So what are the risks? Let’s lay them out:
- Consent Erosion: The scan often happens silently or under soft coercion—use it or lose access.
- Data Permanence: Once your biometric template exists, it can’t be revoked. If compromised, it’s compromised for life.
- Function Creep: What starts as identity verification can become movement tracking, purchasing analysis, or behavior prediction.
- Privatized Surveillance: Government services are now relying on private companies like iProov to manage identity—removing accountability and obscuring responsibility.
- Global Integration: Biometric systems are becoming interoperable across borders. A scan used for healthcare in one country could be linked to immigration checks in another.
The takeaway here is not that iProov is the villain. It’s that biometric infrastructure is scaling faster than regulation, transparency, or public understanding. And while iProov may claim high security and limited storage, its partnerships, deployment models, and data pathways often fall outside user control.
So ask yourself: when did you agree to this?
Because if the answer is never—but the scan still happened—then we’ve already passed the point of biometric consent.
In the age of liveness detection, the only thing more valuable than your face… is the fact that you showed up. And the systems now in place will remember every time you do.