Maxence Boels

Artificial Intelligence Researcher

Autonomous Drone Defense System

#COUNTER-UAS #SWARM-DEFENSE #COMPUTER-VISION #AUTONOMOUS-NAVIGATION #CRITICAL-INFRASTRUCTURE

Project Overview

The Autonomous Drone Defense System is a vision-based Counter-UAS swarm solution designed to protect critical infrastructure from hostile drone threats. Developed during the SF25 Hackathon (November 15, 2025) in collaboration with Parv Kapoor (Carnegie Mellon University), the system implements a hybrid centralized-decentralized architecture for detecting and intercepting adversarial drones.

The project addresses real-world threats exemplified by recent European airspace violations and strategic operations, providing a defense mechanism that remains effective under RF jamming because its reactive policies rely on onboard computer vision rather than radio links.

System Demonstration

Multi-Drone Patrolling

Real-time object detection during multi-drone patrolling with centralized Bird's Eye View command center

Autonomous Interception

Decentralized autonomous drone navigation demonstrating detect-and-intercept capabilities using proportional navigation guidance

System Architecture

Hybrid Autonomous Swarm Defense

The system implements a three-layer defense strategy:

  • Patrolling: Multi-drone patrols running real-time object detection
  • Command: Centralized Bird's Eye View (BeV) command center aggregating detections across the swarm
  • Intercept: Decentralized autonomous drone navigation for detect-and-intercept missions
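As a toy illustration of the command layer, per-drone detections can be merged into a single BeV track list. The data model and greedy merge rule below are hypothetical, chosen only to show the aggregation idea, not the hackathon implementation:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    drone_id: str      # reporting patrol drone (hypothetical identifier)
    x: float           # BeV east coordinate, metres, in an assumed local frame
    y: float           # BeV north coordinate, metres
    confidence: float  # detector score

def fuse_detections(detections, merge_radius=15.0):
    """Merge detections that fall within merge_radius of each other into
    single BeV tracks, keeping the highest-confidence report per cluster.
    Greedy and O(n^2); real trackers would use association and filtering."""
    tracks = []
    for det in sorted(detections, key=lambda d: -d.confidence):
        if all((det.x - t.x) ** 2 + (det.y - t.y) ** 2 > merge_radius ** 2
               for t in tracks):
            tracks.append(det)
    return tracks
```

Two patrol drones reporting the same target a few metres apart would thus collapse into one track, while a distant second target stays separate.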

Technical Stack

  • Visualization: Scene BeV rendered with OpenLayers for geographic coordinates and high-level mission logic
  • Simulation: Cloud-based photorealistic simulation using Cesium for geolocated maps
  • Multi-Agent Support: DJI Matrice + Freefly Astro platforms
  • Edge AI Models: OwlV2 (object detection) and ZoeDepth (depth estimation)
  • Deployment: PX4 deployment plugin for real-world operations
  • Scalability: Compute scales with on-premise or cloud resources

Ground Control System

The operator interface features a dual-panel system with pixel streaming for live feeds and an OpenLayers-based map displaying drone positions, terminal access roads, and threat indicators. Real-time diagnostics include frame rate monitoring and recording capabilities for post-mission analysis.

Onboard Navigation Policy

Vision-Based Detection + Proportional Navigation

1. Perception System

  • Object Detection: Pre-trained OwlV2 model generates bounding boxes for target identification
  • Depth Estimation: Pre-trained ZoeDepth model provides metric depth estimates from monocular vision
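The two perception outputs can be fused into a line-of-sight estimate with a standard pinhole back-projection. The sketch below is illustrative: the intrinsics fx, fy, cx, cy are assumed calibration values, and the function name is hypothetical, not project code:

```python
import math

def bbox_center_to_los(bbox, depth_m, fx, fy, cx, cy):
    """Back-project a detection into camera-frame bearing angles.

    bbox: (x_min, y_min, x_max, y_max) in pixels, e.g. from an OwlV2 detection.
    depth_m: metric depth at the box centre, e.g. sampled from a ZoeDepth map.
    fx, fy, cx, cy: pinhole intrinsics of the drone camera (assumed known).
    Returns (azimuth, elevation) in radians and the 3D point (x, y, z) in metres.
    """
    u = (bbox[0] + bbox[2]) / 2.0
    v = (bbox[1] + bbox[3]) / 2.0
    # Pinhole back-projection: pixel -> normalized ray, scaled by metric depth.
    x = (u - cx) / fx * depth_m
    y = (v - cy) / fy * depth_m
    z = depth_m
    azimuth = math.atan2(x, z)     # positive: target right of the optical axis
    elevation = math.atan2(-y, z)  # positive: target above it (image y grows down)
    return azimuth, elevation, (x, y, z)
```

These bearing angles are exactly the line-of-sight measurements the guidance law in the next section consumes.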

2. Proportional Navigation (PN) Guidance Law

Adapted from missile guidance technology, the interceptor steers so that the line-of-sight (LOS) angle λ to the target remains constant, the classical collision-triangle condition for interception. The discrete guidance update is:

γᵢ₊₁ = N(λᵢ₊₁ − λᵢ) + γᵢ,   i = 0, 1, 2, …

where γ is the interceptor heading and N is the navigation constant (typically N = 3); each heading correction is proportional to the measured change in LOS angle, driving the engagement toward a collision-triangle geometry.
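The update above can be sketched as a short 2-D pursuit loop. All numbers here (speeds, the turn-rate clamp, and the engagement geometry) are illustrative assumptions for the sketch, not the hackathon implementation:

```python
import math

N_GAIN = 3.0  # navigation constant, the typical N = 3 noted above

def pn_heading_update(gamma, los_prev, los_now, n=N_GAIN, max_turn=math.radians(30)):
    """One discrete PN step: gamma_next = N * (los_now - los_prev) + gamma.

    The per-step heading change is clamped to max_turn as a stand-in for
    quadcopter maneuverability limits (an assumption of this sketch).
    """
    delta = n * (los_now - los_prev)
    delta = max(-max_turn, min(max_turn, delta))
    return gamma + delta

def closest_approach(steps=300, dt=0.1):
    """Simulate an interceptor chasing a crossing target; return the
    minimum interceptor-target distance reached (metres)."""
    ix, iy, gamma = 0.0, 0.0, 0.0             # interceptor at origin, heading east
    tx, ty, tvx, tvy = 40.0, 30.0, -2.0, 0.0  # target crossing westward at 2 m/s
    speed = 6.0                               # interceptor speed, m/s
    los_prev = math.atan2(ty - iy, tx - ix)
    min_dist = math.hypot(tx - ix, ty - iy)
    for _ in range(steps):
        # Advance both vehicles, then re-measure the LOS angle and correct heading.
        ix += speed * math.cos(gamma) * dt
        iy += speed * math.sin(gamma) * dt
        tx += tvx * dt
        ty += tvy * dt
        los_now = math.atan2(ty - iy, tx - ix)
        gamma = pn_heading_update(gamma, los_prev, los_now)
        los_prev = los_now
        min_dist = min(min_dist, math.hypot(tx - ix, ty - iy))
    return min_dist
```

Because the heading correction is proportional to the LOS rate, the interceptor settles onto a lead-pursuit course and the miss distance shrinks far below the initial 50 m separation.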

Development Approach

The project started with a simple proportional navigation implementation, validated through extensive simulation using rerun.io for telemetry visualization and depth perception analysis. The system visualizes target 3D positions, velocities, and real-time bounding boxes with depth heatmaps for operator awareness and debugging.

Performance & Capabilities

System Advantages

  • RF Jamming Immunity: Vision-based reactive policy operates independently of radio communications
  • Hybrid Architecture: Combines centralized command with decentralized execution for resilience
  • Intuitive Interface: RTS-game style operator interface for efficient command and control
  • Zero-Shot Deployment: Sim-to-real pipeline enables rapid deployment without extensive field testing

Technical Features

  • Real-Time Processing: Edge AI models run at 30fps on drone hardware
  • Photorealistic Simulation: SOTA simulation environments for policy training and validation
  • Multi-Agent Coordination: Scalable swarm architecture supporting diverse drone platforms
  • Data Visualization: Comprehensive telemetry and debugging through rerun.io integration

Key Innovations

Vision-Based Guidance

Unlike traditional RF-based systems vulnerable to electronic warfare, our vision-based approach using OwlV2 and ZoeDepth ensures reliable target tracking and interception even in contested electromagnetic environments.

Proportional Navigation for Drones

Adapting classical missile guidance principles to quadcopter dynamics, achieving efficient interception trajectories while accounting for drone maneuverability constraints and real-time visual feedback.

Hybrid Command Architecture

Balancing centralized situational awareness with decentralized autonomous execution, enabling both coordinated swarm behavior and individual drone resilience to communication disruptions.

Sim-to-Real Pipeline

Seamless transition from photorealistic Cesium-based simulation to PX4-powered real-world deployment, minimizing field testing requirements while maintaining performance guarantees.

Development Timeline

SF25 Hackathon (November 15, 2025)

Rapid prototyping and demonstration of core capabilities including multi-agent simulation, vision-based detection pipeline, proportional navigation implementation, and ground control interface within hackathon timeframe.

Future Enhancements

  • Extended swarm coordination algorithms for multi-target scenarios
  • Advanced threat classification and prioritization
  • Integration with existing air defense systems
  • Field testing and validation at critical infrastructure sites
  • Enhanced autonomy for GPS-denied environments