Mesh-networked acoustic AI that detects, classifies, and locates drones — including fiber-optic FPVs invisible to radar — with zero RF emissions.
Modern FPV drones, particularly fiber-optic guided variants, emit no radio signature, defeat jamming, and fly below radar coverage. The tools built to counter them were designed for a different era.
Fiber-optic FPVs carry no RF transmitter, emit no electronic signature, and are too small and low-altitude for conventional radar. They are effectively invisible to every detection system designed around electromagnetic emissions.
RF jamming defeats RF-linked drones, but broadcasts your location to every direction-finding receiver in range. In contested environments, transmitting to defend yourself makes you a target.
Most detection solutions require fixed power, internet connectivity, GPS, or centralized servers — none of which exist in a forward position. When the infrastructure fails, the sensor goes dark.
Every drone makes sound. Watchpoint's sensor nodes capture that signature, classify it on-device using a custom AI model, and triangulate position across the mesh — delivering bearing and location to the operator's map in real time.
Each node runs a trained classifier on a custom microcontroller board. Detection happens entirely on-device — no cloud, no latency, no single point of failure. Nodes are weatherproof, battery-powered, and backpack-portable.
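The firmware and trained model are not described beyond this, but the shape of the detection loop can be sketched. The following is a minimal stand-in, assuming a 16 kHz sample rate and substituting a simple band-energy feature and threshold for the actual trained classifier:

```python
import numpy as np

SAMPLE_RATE = 16_000          # assumed sensor sampling rate
FRAME = 1024                  # samples per analysis frame
DRONE_BAND_HZ = (80, 1200)    # assumption: rotor/prop harmonics concentrate here

def band_energy_ratio(frame: np.ndarray) -> float:
    """Fraction of spectral energy inside the assumed drone band."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame)))) ** 2
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / SAMPLE_RATE)
    band = (freqs >= DRONE_BAND_HZ[0]) & (freqs <= DRONE_BAND_HZ[1])
    return float(spectrum[band].sum() / (spectrum.sum() + 1e-12))

def classify(audio: np.ndarray, threshold: float = 0.6) -> bool:
    """Stand-in for the trained on-device model: average the
    band-energy feature over frames and threshold it."""
    n = len(audio) // FRAME
    ratios = [band_energy_ratio(audio[i * FRAME:(i + 1) * FRAME]) for i in range(n)]
    return float(np.mean(ratios)) > threshold
```

A harmonic stack (rotor-like) concentrates energy in the band and trips the detector; broadband noise does not. The real classifier replaces the threshold with learned weights, but the frame-by-frame, fully on-device structure is the same.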
Nodes synchronize detections across a self-forming mesh, computing time-difference-of-arrival across the array. No internet, no GPS, no infrastructure. The network routes around lost nodes automatically.
A web-based operator interface receives streamed bearing and position data from the sink node. Real-time display shows drone classification, bearing from each node, and estimated position overlaid on the map.
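The wire format is not specified; purely as an illustration, a per-detection message streamed from the sink node might carry fields like these (all names hypothetical, not Watchpoint's actual protocol):

```python
import json

# Hypothetical schema -- every field name here is illustrative.
detection = {
    "node_id": "wp-03",
    "timestamp_us": 1_712_000_123_456,          # mesh-synchronized clock
    "classification": "fpv",
    "confidence": 0.93,
    "bearing_deg": 241.5,                        # bearing from this node
    "position_est": {"x_m": 30.0, "y_m": 60.0},  # mesh TDOA estimate, if available
}
message = json.dumps(detection)
```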
Designed for rapid deployment in contested forward positions. No technician required.
Place four sensor nodes around the area of interest. Nodes self-organize into a mesh — no configuration required.
Each node continuously samples audio. The on-board AI classifier identifies drone signatures in real time, even in high-noise environments.
Detection timestamps are synchronized across the mesh. Time-difference-of-arrival is used to compute bearing from each node and estimate target position.
The sink node streams position and classification data to a web-based tactical map. The operator sees bearing, estimated location, and threat class in real time.
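The localization step above can be sketched end to end. This is a minimal 2-D TDOA solver under stated assumptions: synchronized arrival timestamps from four nodes, a brute-force grid search in place of whatever estimator the mesh actually runs, and a nominal speed of sound. Note that a clock offset shared by all nodes cancels, since only time differences enter the residual:

```python
import numpy as np

C = 343.0  # nominal speed of sound, m/s

def tdoa_locate(nodes, arrival_times, span=150.0, step=1.0):
    """Estimate a 2-D source position from synchronized arrival
    times by grid search over TDOA residuals. Illustrative only;
    a fielded system would use a least-squares or filtering
    formulation instead of a grid."""
    nodes = np.asarray(nodes, float)
    t = np.asarray(arrival_times, float)
    dt_meas = t - t[0]                        # TDOAs relative to node 0
    xs = np.arange(-span, span + step, step)
    gx, gy = np.meshgrid(xs, xs)              # candidate positions
    # distance from every candidate point to every node: shape (N, H, W)
    d = np.hypot(nodes[:, 0, None, None] - gx, nodes[:, 1, None, None] - gy)
    dt_pred = (d - d[0]) / C
    err = ((dt_pred - dt_meas[:, None, None]) ** 2).sum(axis=0)
    i, j = np.unravel_index(np.argmin(err), err.shape)
    return float(gx[i, j]), float(gy[i, j])
```

With four nodes on the corners of the area of interest, exact timestamps recover the source to within the grid resolution, even when every clock carries the same unknown offset.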
In v2, the system closes the kill chain autonomously. When the mesh localizes a target, it cues an interceptor FPV drone, computes an intercept vector, and guides it to the target position — without human aiming.
Operators approve the intercept. The system executes it. No line-of-sight, no manual piloting, no delay.
The mesh computes the intercept vector from acoustic position data and uplinks it to the interceptor FPV without operator aiming input.
The interceptor receives continuous position updates as the target moves, correcting its flight path until terminal engagement.
Target acquisition is entirely acoustic — no radar lock, no RF seeker, no need for the target to emit any signal.
Intercept authorization remains with the operator. The system handles everything below that decision threshold.
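The actual guidance law is not described here; as a rough sketch of the cueing math, a constant-velocity lead-pursuit rule (a deliberate simplification) aims the interceptor at a predicted target position rather than its current one:

```python
import numpy as np

def intercept_command(interceptor_pos, target_pos, target_vel, speed):
    """Velocity command toward a predicted intercept point.
    Simple constant-velocity lead pursuit -- a sketch, not the
    system's real guidance law."""
    p = np.asarray(interceptor_pos, float)
    tp = np.asarray(target_pos, float)
    tv = np.asarray(target_vel, float)
    t_go = np.linalg.norm(tp - p) / speed   # naive time-to-go estimate
    aim = tp + tv * t_go                    # predicted target position
    direction = aim - p
    n = np.linalg.norm(direction)
    return direction / n * speed if n > 0 else np.zeros_like(p)
```

Recomputing the command on every mesh position update gives the continuous path correction described above: each fresh acoustic fix replaces `target_pos` and `target_vel`, and the interceptor re-aims without any operator input.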
We are building and validating in parallel. Every subsystem goes from breadboard to field in weeks, not quarters. The roadmap below reflects what is operational today and what ships next.
Operational: Custom AI model trained on drone audio signatures running on the microcontroller board. Classification latency under 100 ms.
Active: Microcontroller-based sensor board with acoustic front-end, on-device inference, and mesh radio interface. Prototype builds in active testing.
In Progress: Multi-node time synchronization and TDOA-based bearing triangulation across the sensor array.
In Progress: Web-based operator display showing real-time bearing lines, position estimate, and classification from the sink node.
Upcoming: Mesh-to-interceptor cueing, continuous position uplink, and guided FPV intercept without operator aiming.
We are building selectively. If you operate in contested environments, source drone countermeasure technology, or invest in early-stage defense hardware, we want to hear from you.
For procurement inquiries, operational demonstrations, investment conversations, or technical collaboration — reach us directly.
Defense units and integrators with operational context. Investors with a track record in deep-tech or dual-use hardware. Engineers in embedded systems, acoustics, or drone countermeasures.