How to Defeat Facial Recognition: A 2026 Technical Guide
A 2021 GAO report found 18 or more US federal agencies use facial recognition. ICE scans people at borders, including minors, without ID requirements. The FBI's Next Generation Identification system holds more than 650 million photos. Clearview AI scraped 50+ billion facial images from social media and sells access to police and private investigators. As of 2026, real-time face matching from consumer hardware is no longer limited to governments.
Defense has three layers: shrink your indexed faceprint, break capture in public, and harden the digital systems that can link a face match to the rest of your life.
Start with a threat model: a journalist at protests faces different risks than someone worried about casual OSINT. Pick the layers that match your threat, not every trick on the list.
How Facial Recognition Systems Work
Most systems follow the same pipeline:
- Detection: Find a face in the frame
- Landmark detection: Map 68–128 points across the eyes, nose, jaw, and lips
- Feature embedding: Turn that geometry into a compact vector, the faceprint
- Database matching: Compare the faceprint against stored embeddings
- Threshold decision: Return a match if similarity clears the confidence threshold
That pipeline shows where to attack: break landmark detection, cut your database coverage, or keep your faceprint out of searchable datasets. NIST's Face Recognition Vendor Test (FRVT) tracks vendor accuracy. Top systems now clear 99.5% accuracy at 1-in-1,000,000 false match rates.
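The last two pipeline steps, database matching and the threshold decision, can be sketched in a few lines. This is an illustrative sketch only: the toy 3-dimensional embeddings, the names, and the 0.6 cosine threshold are assumptions for demonstration, not any vendor's actual values (real systems use 128- to 512-dimensional embeddings from a trained network).

```python
import math

def cosine_similarity(a, b):
    # Similarity between two face embeddings; unit-length vectors score in [-1, 1].
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match(probe, database, threshold=0.6):
    # Pipeline steps 4-5: compare a probe faceprint against stored embeddings
    # and return the best hit only if it clears the confidence threshold.
    best_id, best_score = None, -1.0
    for person_id, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    return (best_id, best_score) if best_score >= threshold else (None, best_score)

# Hypothetical gallery of enrolled faceprints.
db = {"alice": [1.0, 0.0, 0.0], "bob": [0.0, 1.0, 0.0]}
```

The threshold is the knob that trades false matches against false non-matches: raise it and the system misses more true matches; lower it and it misidentifies more strangers. Evasion techniques work by pushing your probe embedding below that threshold, or by keeping your embedding out of `db` entirely.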
Layer 1: Reduce Your Indexed Facial Footprint
Facial recognition needs something to match against. Fewer indexed photos means fewer chances to identify you.
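One concrete piece of footprint reduction: photos you do post should not carry EXIF metadata such as GPS coordinates and device identifiers, which make a face match far easier to correlate. A minimal sketch using the Pillow library that re-encodes only the pixel data, dropping EXIF blocks in the process (the function name and approach are illustrative, not a standard tool):

```python
from PIL import Image

def strip_metadata(src_path, dst_path):
    # Copy pixel data into a fresh image object; EXIF, GPS, and other
    # metadata blocks attached to the original file are not carried over.
    with Image.open(src_path) as img:
        clean = Image.new(img.mode, img.size)
        clean.putdata(list(img.getdata()))
        clean.save(dst_path)
```

This does not remove the face itself from the photo, and it does nothing about metadata a platform adds server-side; it only stops you from handing over location and device data alongside the image.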
Layer 2: Physical Evasion in Public Spaces
| Method | Effectiveness | Against what | Practicality |
|---|---|---|---|
| Hat + scarf covering | High | CCTV, mobile cameras | High (inconspicuous) |
| Sunglasses | Medium | Landmark detection | High |
| CV Dazzle makeup | Medium–High | Landmark-based FR | Low (conspicuous) |
| IR-blocking makeup | High (IR cameras) | Night surveillance cameras | Medium |
| Infrared LED glasses | High (some cameras) | Cameras with IR sensors | Medium |
| Avoiding camera sight lines | Highest | All systems | Situation-dependent |
CV Dazzle (Adversarial Makeup)
Adam Harvey's CV Dazzle project showed that asymmetric, high-contrast patterns can break many landmark detectors, especially when they disrupt the nose bridge and eye area. The core ideas are simple:
- Asymmetry breaks models that expect bilateral symmetry
- High contrast around the nose bridge hides a key landmark
- Patterns into the hairline can confuse face-boundary detection
Limitation: newer models train against adversarial examples. CV Dazzle still works better against older CCTV stacks than against newer AI cameras.
IR-Blocking Makeup and Accessories
Many surveillance cameras, especially in low light, use near-infrared illumination you cannot see. Standard makeup is partly transparent to it. IR-blocking cosmetics with high titanium dioxide or zinc oxide content can reflect or absorb NIR and wash out facial detail on IR cameras while looking normal in visible light.
Infrared LED Arrays
Glasses or hats with IR LEDs can flood the area around your face with infrared light and saturate some sensors. Researchers showed this years ago, and commercial products exist. Results depend on the camera. Systems with solid IR-cut filters will shrug it off. Cheap surveillance cameras often will not.
Layer 3: Digital Infrastructure
A face match rarely stands alone. Once you are identified, other data gets pulled in. Harden the systems that make that correlation easy.
- GrapheneOS: Your phone camera is part of the problem. Apps with camera access on stock Android can also pull location and metadata. GrapheneOS lets you deny camera access per app and monitor access attempts.
- Mullvad VPN: Stops your network activity from tying back to you as easily after a physical ID event.
- Anonymous eSIM: Carrying a normal phone at a protest gives your carrier a clean location trail. A Silent.link anonymous eSIM paid with XMR breaks that link.
- IMSI catcher awareness: Stingrays show up at protests and public events to identify phones. A Faraday pouch kills the RF signature. See counter-surveillance detection guide.
Jurisdiction Matters
The legal map keeps moving. As of 2026:
- EU: The AI Act bans real-time biometric surveillance in public in most cases. Enforcement is underway.
- US federal: No federal law meaningfully limits government facial recognition. More than 18 agencies use it.
- US state: 23 states have biometric privacy laws. Illinois BIPA remains the strongest. San Francisco, Boston, and Portland banned municipal facial recognition.
- UK: The Metropolitan Police uses live facial recognition at public events. Courts have pushed back on some deployments, but the systems remain in use.
Law trails the tech. If your threat is immediate, technical countermeasures usually matter more than waiting for courts.
For hardware counter-surveillance tools, see Counter-Surveillance: How to Find Hidden Cameras and Trackers and DIY Privacy Hardware.
Cunicula does not endorse illegal activity. All techniques described are legal in most jurisdictions when used defensively. Verify local laws before use.
Follow the Money
Facial recognition is a contract business. A small group of firms sells to police, airports, and private clients while federal oversight stays thin.
- Market size: $5B global (2022) → ~$12B projected by 2028 (CAGR ~15%)
- Key players: Clearview AI ($30M funding, Peter Thiel + Kirenaga, 3,100+ law enforcement clients), NEC, AWS Rekognition, Azure Face, Google Cloud Vision, PimEyes ($15/month, open to the public)
- US government: 18+ federal agencies with no statutory limit on use; FBI NGI: 650M+ photos, $1B+ investment; TSA biometric entry-exit at all major airports
Frequently Asked Questions
Does anti-surveillance makeup actually defeat facial recognition?
Sometimes. CV Dazzle-style makeup can break older commercial systems by hiding key landmarks around the eyes and nose bridge. High-contrast, asymmetric patterns work best. IR-blocking makeup can also disrupt near-infrared cameras used in low light. Newer models trained on adversarial examples resist this better. In public, the most reliable evasion is still physical: hats, scarves, and staying out of camera sight lines, plus reducing the number of indexed face photos online.
Which US government agencies use facial recognition?
A 2021 GAO report found 18 federal agencies using facial recognition, including the FBI, ICE, TSA, IRS, Secret Service, and State Department. ICE and CBP use it at borders. The FBI NGI-IPS system holds more than 650 million photos. Many state and local agencies also use Clearview AI or similar databases.
Can I opt out of Clearview AI facial recognition?
Sometimes, but the result is limited. Clearview AI offers opt-outs for private individuals in some states with biometric privacy laws. That can suppress results shown to the public, but law-enforcement users may still get matches. PimEyes has a simpler civilian opt-out. Neither removes your images from the underlying database. Long term, the real fix is posting fewer high-quality face photos online.
Does a VPN protect me from facial recognition?
No. A VPN hides your IP address and network traffic. It does nothing against physical cameras or face matching. For facial recognition, the tools are physical concealment, adversarial patterns, and database removal. For network surveillance, use a VPN. They solve different problems.
What legal protections exist against government facial recognition?
As of 2026, 23 US states have some biometric privacy law. Illinois BIPA is the strongest because it allows private lawsuits and has produced large settlements. Some cities, including San Francisco, Boston, and Portland, banned government facial recognition. In the EU, the AI Act bans real-time biometric surveillance in public spaces in most cases. Federal US agencies still face little restraint.