
Overview

Facial recognition technology (FRT) has moved from an experimental capability to an active enforcement tool, deployed extensively at U.S. borders and in the interior of the country.


CBP Traveler Verification Service (TVS)

System Architecture

TVS is a secure, cloud-based facial biometric matching service deployed across air, land, and sea environments.

How It Works

  1. CBP builds a "pre-staged gallery" of expected travelers
  2. Gallery sources: passports, visas, prior DHS encounters, passenger manifests
  3. A live photograph is captured at the checkpoint
  4. The live photo is matched 1:N (one-to-many) against the gallery
  5. If there is no gallery match, the system falls back to 1:1 (one-to-one) matching against the travel document photo
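
The matching flow in steps 4 and 5 can be sketched roughly as follows. This is a minimal illustration assuming a generic face-embedding model scored with cosine similarity; the thresholds, function names, and gallery structure are hypothetical and are not CBP's actual implementation.

    import numpy as np

    # Hypothetical decision thresholds; operational systems tune these empirically.
    GALLERY_MATCH_THRESHOLD = 0.60   # one-to-many (1:N) identification
    DOCUMENT_MATCH_THRESHOLD = 0.55  # one-to-one (1:1) verification fallback

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def verify_traveler(live_embedding, gallery, document_embedding=None):
        """Illustrative TVS-style flow: search the pre-staged gallery first (step 4),
        then fall back to a 1:1 comparison against the travel document photo (step 5).

        gallery: dict mapping traveler_id -> face embedding derived from passports,
        visas, prior DHS encounters, and passenger manifests.
        """
        # Step 4: one-to-many search against the pre-staged gallery.
        best_id, best_score = None, -1.0
        for traveler_id, embedding in gallery.items():
            score = cosine_similarity(live_embedding, embedding)
            if score > best_score:
                best_id, best_score = traveler_id, score
        if best_id is not None and best_score >= GALLERY_MATCH_THRESHOLD:
            return {"mode": "1:N", "matched_id": best_id, "score": best_score}

        # Step 5: no gallery hit, so compare 1:1 against the travel document photo.
        if document_embedding is not None:
            doc_score = cosine_similarity(live_embedding, document_embedding)
            if doc_score >= DOCUMENT_MATCH_THRESHOLD:
                return {"mode": "1:1", "matched_id": "document", "score": doc_score}

        return {"mode": "no_match", "matched_id": None, "score": best_score}

A real deployment would add capture-quality gating, liveness checks, and audit logging; the point of the sketch is only the decision order of steps 4 and 5.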

Accuracy vs. Capture Performance

Metric              | Goal                 | Status
Matching accuracy   | 98% correct matches  | ✅ Exceeded
False positive rate | <0.1%                | ✅ Met
Capture rate        | 97%                  | ❌ Failed (requirement abandoned)

Capture Failures

GAO documented reasons for capture failure:

  • Camera outages
  • Incorrectly configured gate systems
  • Airline agents reverting to manual boarding
  • Families and wheelchair users processed manually

CBP abandoned the 97% capture requirement because airline participation is voluntary.

Data Retention by Citizenship

Population    | TVS Cloud Retention | Permanent Storage
U.S. Citizens | ≤12 hours           | Deleted (if opt-out exercised)
Non-Citizens  | ≤14 days            | Transferred to IDENT/HART (75 years)

Privacy Notice Failures

GAO (GAO-20-568) found:

  • Opt-out signage frequently obscured
  • Notices outdated or absent
  • Citizens unaware of right to request manual screening
  • Consent framework "meaningless for average traveler"

ICE Mobile Fortify

System Description

Mobile Fortify is a mobile application, developed by NEC, that enables ICE officers to run facial recognition in the field.

Capabilities

  • Runs on government-issued smartphones
  • Captures faceprints and contactless fingerprints
  • Instantaneous cross-reference against databases
  • Access to 200+ million images (DHS, FBI, State Department)

Privacy Concerns

No Consent Mechanism

From the Privacy Threshold Analysis:

"ICE does not provide the opportunity for individuals to decline or consent to the collection and use of biometric data/photograph collection"

Data Retention

  • Captured photographs: 15 years
  • Fingerprints: 15 years
  • Regardless of match result
  • Regardless of citizenship status

PIA Deficiency

Mobile Fortify was deployed without a dedicated Privacy Impact Assessment.

ICE claimed that "existing privacy documentation is sufficient," relying on the broader Enforcement Integrated Database (EID) PIA.

Civil liberties organizations counter that the EID PIA assesses collection from known investigation subjects, not arbitrary encounters with members of the public.


NIST Accuracy Studies

FRVT Program

The National Institute of Standards and Technology (NIST) conducts the Face Recognition Vendor Test (FRVT) program, since split into the Face Recognition Technology Evaluation (FRTE) and the Face Analysis Technology Evaluation (FATE).

Landmark Study: NISTIR 8280

FRVT Part 3: Demographic Effects analyzed:

  • 189 commercial algorithms
  • 99 developers
  • 18.27 million images
  • 8.49 million individuals

Key Finding: Demographic Differentials

False Match Rates (FMR) vary drastically across demographic groups, in some cases by factors of 10 to 100 or more.
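
To see why a 10x to 100x differential matters operationally, the sketch below applies the standard approximation for one-to-many search: under a simplifying independence assumption, the chance of at least one false match against a gallery of N entries is 1 - (1 - FMR)^N. The per-comparison FMR values and gallery size are hypothetical, chosen only to illustrate the compounding; they are not figures from the NIST report.

    # Illustrative arithmetic: how a demographic FMR differential compounds in 1:N search.
    # The per-comparison FMR values and the gallery size are hypothetical.

    def prob_any_false_match(fmr_per_comparison: float, gallery_size: int) -> float:
        """P(at least one false match) = 1 - (1 - FMR)^N, assuming independent comparisons."""
        return 1.0 - (1.0 - fmr_per_comparison) ** gallery_size

    baseline_fmr = 1e-7        # hypothetical FMR for the best-performing group
    differential = 100         # upper end of the 10-100x range reported by NIST
    gallery_size = 10_000      # hypothetical enforcement gallery size

    for label, fmr in [("lowest-FMR group", baseline_fmr),
                       ("highest-FMR group", baseline_fmr * differential)]:
        p = prob_any_false_match(fmr, gallery_size)
        print(f"{label}: per-comparison FMR={fmr:.0e}, "
              f"P(>=1 false match in a {gallery_size:,}-entry search)={p:.2%}")

Under these illustrative numbers, a search that almost never misidentifies someone from the lowest-FMR group falsely matches someone from the highest-FMR group in roughly one search in ten, which is the operational meaning of the differentials documented below.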


Documented Demographic Bias

By Race/Region of Origin

Population        | False Match Rate
West/East African | Highest
East Asian        | High
Eastern European  | Lowest

By Gender

Population | Finding
Women      | Consistently higher FMR than men

By Age

Population         | Finding
Children           | Elevated FMR
Elderly            | Elevated FMR
Middle-aged adults | Lowest differentials

U.S. Mugshot Datasets

Population       | False Match Rate
American Indian  | Highest
African American | High
Asian            | High

Understanding Error Types

False Non-Match Rate (FNMR)

  • The algorithm fails to match two images of the same person
  • Heavily dependent on image quality
  • Caused by: poor lighting, extreme angles, motion blur

False Match Rate (FMR)

  • The algorithm incorrectly matches images of two different people
  • Occurs even with pristine photographs
  • Caused by: anatomical similarity between unrelated faces
  • Catastrophic in an enforcement context
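
The distinction can be made concrete with a small sketch of how the two rates are computed from scored comparisons at a decision threshold; the similarity scores and the threshold below are invented purely for illustration.

    # Computing FNMR and FMR from scored comparisons at a decision threshold.
    # The similarity scores and the 0.60 threshold are invented for illustration.

    def error_rates(genuine_scores, impostor_scores, threshold):
        """FNMR: share of same-person pairs scoring below the threshold (missed matches).
        FMR: share of different-person pairs scoring at/above the threshold (false matches)."""
        fnmr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
        fmr = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
        return fnmr, fmr

    genuine = [0.91, 0.88, 0.42, 0.79, 0.95]   # same-person pairs; 0.42 could be a blurry capture
    impostor = [0.12, 0.31, 0.64, 0.08, 0.22]  # different-person pairs; 0.64 is a look-alike

    fnmr, fmr = error_rates(genuine, impostor, threshold=0.60)
    print(f"FNMR = {fnmr:.0%} (quality-driven misses), FMR = {fmr:.0%} (look-alike false matches)")

Raising the threshold pushes FMR down at the cost of a higher FNMR; the enforcement concern documented by NIST is that, at any fixed threshold, FMR is not uniform across demographic groups.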

Consequences of False Positive

  • Wrongful detention
  • Erroneous visa denial
  • False implication in criminal activity
  • Deportation of wrong person

"Other-Race Effect"

NIST observed that algorithms often exhibit bias reflecting training data demographics:

  • Algorithms developed in China showed lower error rates for East Asian faces
  • Algorithms developed with majority-white training data showed higher errors for non-white faces

Implication: Bias is often embedded in algorithm development, not inherent to the technology.


Real-World Performance Concerns

U.S. Commission on Civil Rights (2024)

The USCCR briefing report warned of "external validity" issues:

Algorithms tested using controlled NIST databases may perform substantially worse in real-world conditions:

  • Suboptimal lighting
  • Motion blur
  • Low camera resolution
  • Street-level encounters (ICE interior operations)

State DMV Photo Searches

ICE Access to Driver Photos

FOIA documents reveal that ICE and the FBI routinely request facial recognition searches of state driver's license photo databases:

  • State DMVs run the searches against their license photo databases on the agencies' behalf
  • States documented to have done so include Utah, Vermont, and Washington
  • Searches occur without warrants
  • Every resident whose photo is in those databases is subjected to biometric surveillance

Consequence: The driver's license, a legal benefit extended to undocumented residents in these states, is weaponized against the very people who complied with state driving laws.


Implications

For Individuals

Understanding FRT deployment helps:

  1. Know where facial capture occurs
  2. Understand opt-out rights (U.S. citizens at airports)
  3. Recognize data retention periods
  4. Assess risk of field encounters

For Advocates

Documentation supports:

  1. Challenging facial recognition evidence
  2. FOIA requests for specific deployments
  3. Policy advocacy against unregulated use
  4. Litigation on demographic bias

Related Resources