
AI Vision Detection (EO/IR)

Detection · Open-Source Verified

Passive electro-optical / infrared sensors paired with computer-vision classifiers that detect, track, and classify drones from imagery without emitting any signal. Increasingly the backbone of modern C-UAS sensor fusion.

How It Works

Daylight electro-optical (EO) and thermal infrared (IR) cameras feed a CNN or transformer model trained on drone signatures. The model detects small, sub-pixel moving objects, classifies them by airframe, and cues kinetic or electronic-warfare (EW) effectors. The system operates entirely passively, emitting no RF signals that an adversary could geolocate.
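The detect-then-classify step can be illustrated with a minimal sketch. A real system runs a trained CNN; here frame differencing stands in for the detector and a size-based rule stands in for the classifier. All function names, thresholds, and labels are illustrative, not from any fielded product.

```python
import numpy as np

def detect_moving_objects(prev_frame: np.ndarray, frame: np.ndarray,
                          threshold: float = 25.0) -> list[tuple[float, float]]:
    """Frame-differencing stand-in for the CNN detector.

    Returns the centroid (row, col) of pixels that changed between
    consecutive frames; an empty list means no motion detected.
    """
    diff = np.abs(frame.astype(float) - prev_frame.astype(float))
    mask = diff > threshold
    if not mask.any():
        return []
    # Single-blob simplification: one centroid for all changed pixels.
    rows, cols = np.nonzero(mask)
    return [(rows.mean(), cols.mean())]

def classify_by_extent(changed_pixels: int) -> str:
    """Toy stand-in for airframe classification by apparent size."""
    return "small-uas" if changed_pixels < 16 else "large-uas"
```

A production pipeline would replace both functions with a learned model, but the data flow (frames in, centroids and class labels out, effector cue downstream) is the same.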

Technical Specifications

Range: 1–8 km depending on lens and weather
Cost: $30,000–$300,000 per node
Deployment time: Hours (mast/vehicle-mounted); permanent (fixed site)
Crew required: 0–1 (semi-autonomous)
Weight: 5–40 kg per sensor head
Power requirement: 100–400 W
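These figures support quick site planning. As a back-of-envelope sketch (the 90° field of view and 10° overlap are assumed lens parameters, not from the specifications above):

```python
import math

def nodes_for_360(sensor_fov_deg: float, overlap_deg: float = 10.0) -> int:
    """Nodes needed to close a 360-degree ring, allowing some overlap
    between adjacent fields of view so targets are not lost at seams."""
    effective = sensor_fov_deg - overlap_deg
    return math.ceil(360.0 / effective)

def site_power_w(nodes: int, per_node_w: float = 400.0) -> float:
    """Worst-case site power draw, using the 400 W upper bound."""
    return nodes * per_node_w

# A 90-degree lens with 10 degrees of overlap needs 5 nodes,
# drawing at most 2,000 W for the whole ring.
ring = nodes_for_360(90.0)
```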

Advantages

  • Fully passive — undetectable by adversary RF intelligence
  • Works against fiber-optic and autonomous drones (no RF link needed)
  • Day/night operation with cooled IR
  • Scales horizontally with cheap sensor nodes

Disadvantages

  • Degraded by fog, heavy rain, dust, low cloud
  • Requires significant compute and trained models
  • Prone to false positives on birds without a well-trained classifier
  • Limited range vs. radar

Tactical Deployment Tips

  • Mesh multiple sensor nodes for 360° coverage
  • Cue kinetic effectors and EW from vision tracks to remain emissions-quiet
  • Train classifiers on local drone inventory for best accuracy
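The emissions-quiet cueing in the tips above can be sketched as a simple decision rule: fire an effector cue only when several independent sensor nodes agree on a confident drone track inside engagement range. Requiring multi-node agreement also suppresses bird false positives. The data fields, labels, and thresholds here are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Track:
    node_id: str       # which sensor node produced the track
    label: str         # classifier output, e.g. "quadcopter" or "bird"
    confidence: float  # classifier confidence, 0..1
    range_km: float    # estimated slant range to target

def cue_effector(tracks: list[Track],
                 min_confidence: float = 0.8,
                 max_range_km: float = 2.0,
                 min_nodes: int = 2) -> bool:
    """Cue a kinetic/EW effector only when enough independent nodes
    hold a confident non-bird track inside engagement range.

    The whole decision runs on passive vision tracks, so the site
    stays silent in the RF spectrum until the effector acts.
    """
    confirming_nodes = {
        t.node_id for t in tracks
        if t.label != "bird"
        and t.confidence >= min_confidence
        and t.range_km <= max_range_km
    }
    return len(confirming_nodes) >= min_nodes
```

Raising `min_nodes` trades reaction time for fewer false engagements, which is the same tuning knob the meshed-coverage tip implies.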

Limitations & Vulnerabilities

  • Weather-dependent
  • Line-of-sight only — terrain masks targets

Drones It Defeats

Drone types are ranked by how well this system defeats them.

⚠ How Adversaries Defeat This System

Active enemy adaptations observed in the field — distinct from passive limitations above

  • Fog, rain, dust degrade EO/IR range significantly
  • Very small or low-contrast airframes evade classifier at range

Sources & Further Reading

  • Anduril — Lattice / Sentry product documentation
  • Defense News — Computer vision in C-UAS (2024)