Trust Automation has selected Echodyne’s radar system as a primary component of its Small-Uncrewed Air Defence System (SUADS) counter-uncrewed aerial systems (C-UAS) platform, which is intended for the US Air Force (USAF).
Echodyne will assist Trust Automation with fulfilment of a $490m indefinite delivery indefinite quantity (IDIQ) contract awarded in August 2025.
The SUADS platform features several configurations, including the Rapidly Deployable Small-Unmanned Air Defense System (RD-SUADS), Fixed Site Small-Unmanned Air Defense System (FS-SUADS), and Expeditionary Small-Unmanned Air Defense System (EX-SUADS).
All three configurations support the Sensor Open Systems Architecture (SOSA) standard and now fully integrate Echodyne’s EchoShield radars, the first SOSA-aligned radars to be incorporated into the system.
According to Echodyne, RD-SUADS is engineered for quick deployment, with a self-contained and self-powered design, sized to fit standard pallets for military airlift.
FS-SUADS is intended for permanent military sites and can operate independently or linked with others for redundant, comprehensive security.
EX-SUADS, which is designed exclusively for detection, is built for checked baggage transport and sized to fit inside larger SUVs.
Trust Automation CEO Ty Safreno said: “Integrating Echodyne’s advanced radar into our RD-SUADS, FS-SUADS, and EX-SUADS systems enhances both detection range and accuracy, enabling operators to identify threats sooner and respond with greater confidence.
“These systems provide agile, comprehensive C-sUAS capability that helps protect warfighters and critical facilities through elevated situational awareness.”
EchoShield provides medium-range radar coverage, delivering comprehensive location data across a wide range of drone types.
The radar uses standard interfaces and multiple data formats to supply foundational track information, which is used to cue optical sensors, coordinate effector options, and shorten response times.
The system’s classifier uses recursive neural network (RvNN) machine learning to track motion and focus attention where it is needed.

