Key Highlights
Questions and answers:
- Why is detect-and-avoid technology becoming more important in modern aviation? As airspace fills with autonomous and semi-autonomous vehicles, detect-and-avoid systems are essential for managing collision risk safely without human oversight, particularly when aircraft operate beyond visual line of sight or in contested environments.
- How does radar contribute to next-generation detect-and-avoid systems? Radar provides reliable detection of aircraft and obstacles, even in poor visibility, and functions independently of cooperative signals like transponders, making it crucial for safe operation in both civilian and military contexts.
- What role does sensor fusion play in detect-and-avoid architectures? Sensor fusion combines data from radar, EO/IR cameras, lidar, and other sensors to create a more accurate, resilient picture of the environment, allowing autonomous systems to make safe, real-time maneuver decisions even under uncertain conditions.
Avoiding midair collisions always has been a foundational requirement of aviation, but that requirement is changing as airspace fills with autonomous, semi-autonomous, and optionally crewed aircraft operating across multiple domains.
From high-altitude intelligence, surveillance, and reconnaissance (ISR) aircraft to low-altitude uncrewed aerial systems (UAS) and emerging advanced air mobility (AAM) vehicles, next-generation aerospace platforms increasingly are expected to detect, classify, and avoid obstacles with limited or no human intervention.
For military operators, the stakes are higher still, with contested environments, degraded communications, and noncooperative targets undermining assumptions that have shaped traditional sense-and-avoid concepts. Noncooperative air traffic does not use a transponder to transmit its position or identification to air traffic controllers.
Object detection and avoidance, commonly grouped under the term detect-and-avoid, have evolved well beyond their origins as a pilot-assist function. What once was framed as an alerting or advisory capability is now a tightly coupled system-of-systems challenge spanning onboard processing, sensor fusion, decision logic, and flight controls.
Of course, these essential functions must operate within strict size, weight, power, and cost (SWaP-C) constraints while meeting certification and safety requirements largely developed for crewed aircraft.
Early detect-and-avoid implementations have relied heavily on cooperative surveillance technologies such as transponders and the Traffic Alert and Collision Avoidance System (TCAS). Those approaches assumed that nearby aircraft would broadcast accurate position and intent and that a human pilot would remain in the loop to resolve conflicts.
ATC challenges
Modern aircraft, particularly uncrewed systems, no longer can rely on those assumptions. They must account for noncooperative aircraft, terrain, obstacles, and dynamic threats that may not identify themselves or behave predictably.
For uncrewed aircraft operating beyond visual line of sight (BVLOS), detect-and-avoid has become a prerequisite for access to shared airspace.
Regulatory frameworks such as Europe’s U-space and similar initiatives have elevated detect-and-avoid from a desirable capability to an operational requirement. Military UAS face potential electronic attacks on their Global Navigation Satellite Systems (GNSS), such as the U.S. Global Positioning System (GPS), and must continue safe flight when datalinks to ground control stations are degraded or lost. Together, these constraints are driving renewed emphasis on onboard autonomy, resilient sensor architectures, and distributed processing.
Crewing status further complicates system design. In crewed aircraft, detect-and-avoid systems are intended to augment pilot situational awareness through alerting, cueing, or maneuver recommendations. In uncrewed or highly autonomous platforms, those same systems must progress from advisory roles to real-time control authority, making safety-critical decisions at machine speed.
As artificial intelligence (AI) and machine learning are applied to object detection, classification, and intent prediction, the boundary between perception, decision-making, and actuation continues to narrow.
No single sensor can meet all detect-and-avoid requirements across varying weather conditions, lighting environments, ranges, and target types. Radar, electro-optical and infrared (EO/IR) cameras, lidar, passive radio frequency (RF) sensing, and cooperative surveillance systems each offer distinct advantages and limitations. As a result, sensor fusion has emerged as a central design principle, enabled by advances in embedded computing, high-speed interconnects, and real-time signal and image processing.
As military and aerospace platforms move toward higher levels of autonomy, detect-and-avoid is shifting from an optional safety feature to a foundational capability. Its performance will shape not only how aircraft share airspace, but how future missions are planned, executed, and trusted in environments where human oversight may be limited or delayed.
Dishing on radar
Despite growing attention to AI and autonomy, some of the most consequential advances in detect-and-avoid begin with a technology that predates both: radar. As airspace becomes more crowded and less cooperative, radar has reasserted itself as the backbone of next-generation detect-and-avoid architectures because it remains uniquely capable of answering a fundamental safety question: what is out there, and where is it going?
Unlike cooperative surveillance systems that rely on transponders or network participation, airborne detect-and-avoid radar makes no assumptions about target compliance. It detects aircraft, terrain, and obstacles by sensing reflected energy, operating in darkness, clouds, precipitation, and electronic clutter. For military platforms operating in contested or degraded environments, that independence is an operational requirement rather than a design preference.
What has changed is radar’s form and function. Modern detect-and-avoid radars increasingly rely on compact, solid-state active electronically scanned arrays optimized for wide-area coverage rather than long-range targeting. These systems trade raw transmit power for field of view, update rate, and angular resolution, enabling continuous surveillance of the airspace an aircraft must protect. Electronic beam steering supports rapid revisits, track initiation on small or slow-moving targets, and graceful degradation when portions of the array are lost or jammed.
Advances in signal processing have proven as significant as hardware improvements. Detect-and-avoid radar must distinguish meaningful targets from ground clutter, weather, birds, and interference, often at low altitudes where the signal environment is most challenging.
Adaptive processing, track-before-detect techniques, and probabilistic tracking are increasingly used to maintain reliable situational awareness without overwhelming onboard computing resources or downstream decision logic.
The output of modern detect-and-avoid radar is no longer a raw radar picture. Instead, it is a set of confidence-weighted tracks suitable for fusion with other sensors and for use by autonomy systems that may be responsible for maneuver planning. That shift elevates radar from a situational awareness aid to a safety-critical subsystem, with implications for redundancy, fault tolerance, and integration with flight controls.
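A minimal sketch of what such a confidence-weighted track output might look like in software, assuming a constant-velocity motion model and a simple hit/miss confidence heuristic; the class, constants, and tuning values below are illustrative rather than drawn from any fielded radar:

```python
# Illustrative confidence-weighted radar track: a constant-velocity
# Kalman filter plus a hit/miss confidence score consumed downstream.
from dataclasses import dataclass
import numpy as np

@dataclass
class Track:
    x: np.ndarray            # state: [east, north, vel_east, vel_north] (m, m/s)
    P: np.ndarray            # state covariance
    confidence: float = 0.5  # 0..1, weight used by the fusion engine

def predict(trk: Track, dt: float, q: float = 1.0) -> None:
    """Propagate the track forward with a constant-velocity model."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)
    Q = q * np.diag([dt**3 / 3, dt**3 / 3, dt, dt])  # simple process noise
    trk.x = F @ trk.x
    trk.P = F @ trk.P @ F.T + Q

def update(trk: Track, z: np.ndarray, r: float = 25.0) -> None:
    """Fold in a position measurement z = [east, north]; raise confidence."""
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)
    S = H @ trk.P @ H.T + r * np.eye(2)
    K = trk.P @ H.T @ np.linalg.inv(S)
    trk.x = trk.x + K @ (z - H @ trk.x)
    trk.P = (np.eye(4) - K @ H) @ trk.P
    trk.confidence = min(1.0, trk.confidence + 0.1)

def coast(trk: Track) -> None:
    """No detection this scan: decay confidence toward track deletion."""
    trk.confidence = max(0.0, trk.confidence - 0.15)
```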
For uncrewed and optionally crewed aircraft, radar-based detect-and-avoid also represents a shift in authority. Rather than supplementing pilot judgment, radar increasingly feeds autonomy stacks that decide when to alert, when to maneuver, and when to cede control. This transition raises certification challenges, particularly when radar inputs directly influence flight control laws.
Visual sensing
Where radar detects presence and motion, EO and IR systems often are tasked with providing identity, intent, and context, particularly in low-altitude, cluttered environments.
High-resolution visible and infrared cameras can detect small aircraft, terrain features, power lines, cables, and obstacles that may fall below the effective radar cross section of compact airborne sensors. For uncrewed aircraft operating near infrastructure or in urban environments, that visual fidelity is not a luxury but a necessity.
Historically, the limitation of EO/IR-based detect-and-avoid was not sensing but interpretation. A single flight can generate vast amounts of imagery, far exceeding what human operators or rule-based algorithms can process in real time. That constraint is now easing as advances in onboard embedded computing and machine learning push perception directly onto the aircraft.
Neural networks trained on large datasets can identify aircraft, vehicles, birds, and obstacles with increasing reliability, while temporal models estimate motion and collision risk across successive frames. Rather than streaming raw video to the ground, modern EO/IR systems extract features, assign confidence scores, and generate tracks for fusion with radar and other sensors. This approach reduces latency and reliance on communications links that may be congested, denied, or deliberately attacked.
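As a rough illustration of that frame-to-frame logic, the sketch below smooths single-frame detector scores with an exponential moving average and applies hysteresis before confirming a visual track. The smoothing factor and thresholds are assumptions chosen for illustration, not values from any deployed perception stack:

```python
# Temporal confidence filtering for an onboard EO/IR perception pipeline:
# per-frame detector scores are smoothed across frames before a track is
# confirmed or dropped, suppressing single-frame false alarms.

def smooth_confidence(prev: float, frame_score: float, alpha: float = 0.3) -> float:
    """Exponential moving average of single-frame detector confidence."""
    return alpha * frame_score + (1.0 - alpha) * prev

def track_confirmed(smoothed: float, confirmed: bool,
                    confirm_at: float = 0.7, drop_at: float = 0.3) -> bool:
    """Hysteresis: confirm high, drop low, otherwise keep current state."""
    if smoothed >= confirm_at:
        return True
    if smoothed <= drop_at:
        return False
    return confirmed

# A flickering detection settles into a confirmed track only after
# several consistent frames.
conf, confirmed = 0.0, False
for score in [0.9, 0.2, 0.8, 0.85, 0.9, 0.95]:
    conf = smooth_confidence(conf, score)
    confirmed = track_confirmed(conf, confirmed)
```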
The strengths of visual sensing are tightly coupled with its vulnerabilities. Performance degrades in fog, heavy precipitation, and low-contrast conditions. Glare, shadows, and background clutter can challenge even well-trained models, especially in environments outside their training envelope. Unlike radar, which tends to degrade gradually, visual systems can fail abruptly.
For military aircraft, this brittleness drives architectural choices that emphasize sensor diversity and conservative confidence thresholds. EO and IR sensing is rarely relied on in isolation. Instead, it is paired with radar and other modalities to provide semantic understanding where conditions permit, while enabling fallback behaviors when confidence drops.
Despite these challenges, EO/IR-based detect-and-avoid continues to gain ground because it addresses hazards that radar alone cannot. Small drones, unlit aircraft, cables, and static obstacles often present limited radar signatures but remain visually prominent. In urban, littoral, and low-altitude operations, visual sensing is frequently the only practical means of detection and classification.
Carnegie Mellon's ViSafe
Building on radar and EO/IR capabilities, Carnegie Mellon University in Pittsburgh introduced its ViSafe system in 2025, a high-speed, vision-only approach to detect-and-avoid for resource-constrained aerial systems.
ViSafe integrates a multi-camera hardware prototype with an edge-AI learning framework designed under SWaP-C constraints to enable provably safe self-separation during high-speed flight.
ViSafe relies on a control barrier function (CBF) approach that encodes safety thresholds and modifies the vehicle’s nominal control input when potential violations are detected.
This architecture enables autonomous aircraft to navigate complex airspace safely, even at closure rates of approximately 90 mph, as demonstrated in both real-world flight tests and digital twin simulations. The system handles diverse environmental conditions, including variations in weather and lighting, while maintaining guaranteed separation from other aircraft.
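The control barrier function concept can be illustrated in one dimension. The sketch below is not Carnegie Mellon's implementation; it simply shows how a barrier that folds worst-case braking distance into the separation requirement can clamp a planner's nominal command, with the minimum separation, braking limit, and gain all assumed for illustration:

```python
# One-dimensional CBF safety filter sketch (not ViSafe's actual code).
# Barrier h = (gap - d_min) - v^2 / (2 * a_max): the separation margin
# remaining after subtracting worst-case stopping distance.

def cbf_filter(gap_m: float, closing_mps: float, u_nominal: float,
               d_min: float = 150.0, a_max: float = 6.0,
               alpha: float = 1.0) -> float:
    """Clamp the nominal closing-axis acceleration so h stays nonnegative.

    gap_m:       current separation to the intruder (m)
    closing_mps: closing speed (positive = converging)
    u_nominal:   acceleration the planner requests (positive = toward intruder)
    """
    h = (gap_m - d_min) - max(closing_mps, 0.0) ** 2 / (2.0 * a_max)
    if closing_mps <= 0.0:
        return u_nominal  # opening geometry: constraint inactive
    # Enforce dh/dt >= -alpha * h, which bounds admissible acceleration.
    u_bound = a_max * (alpha * h - closing_mps) / closing_mps
    return min(u_nominal, u_bound)
```

When the geometry is benign, the filter passes the nominal command through untouched; as the barrier approaches zero, it forces progressively harder braking, which is what gives this class of approach its safety guarantee.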
ViSafe demonstrates how vision-focused autonomy can complement radar and EO/IR sensors, offering a pathway to fully autonomous detect-and-avoid without relying on cooperative targets or ground-based oversight. Its hardware-in-the-loop (HIL) approach and multi-camera fusion module illustrate how perception, fusion, and control can operate in real time at the aircraft edge.
Comms-aided collision avoidance
Recent research on multi-UAV collision detection and avoidance highlights the growing role of communication-aided reinforcement learning in shared and contested airspace.
A study published in Biomimetic Intelligence and Robotics developed a two-stage curriculum reinforcement learning (CRL) framework in which UAVs first learn basic collision avoidance in simple environments and then adapt to dynamic, obstacle-laden scenarios.
By integrating perception with inter-vehicle communication, UAVs can anticipate collisions beyond the limits of onboard sensing alone. The system’s zero-shot transfer from simulation to real-world flights underscores the potential for scalable multi-agent detect-and-avoid systems that can handle dense traffic, high closure rates, and previously unseen scenarios.
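A generic two-stage curriculum skeleton, loosely patterned on the framework described above, might look like the sketch below. The agent and environment interfaces are hypothetical placeholders, not the study's code:

```python
# Hypothetical two-stage curriculum reinforcement learning loop: the
# agent masters sparse, simple airspace before graduating to dense,
# obstacle-laden scenarios. `agent` and `make_env` are assumed to
# follow a conventional act/learn and reset/step interface.

def train_curriculum(agent, make_env, episodes_per_stage: int = 1000):
    stages = [
        {"n_intruders": 1, "obstacles": False},  # stage 1: basic avoidance
        {"n_intruders": 4, "obstacles": True},   # stage 2: dynamic clutter
    ]
    for cfg in stages:
        env = make_env(**cfg)
        for _ in range(episodes_per_stage):
            obs, done = env.reset(), False
            while not done:
                # Observations include broadcast states of nearby UAVs,
                # extending awareness beyond onboard sensing alone.
                action = agent.act(obs)
                obs, reward, done = env.step(action)
                agent.learn(reward)
    return agent
```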
This approach complements single-platform systems like ViSafe by enabling cooperative, decentralized decision-making, which is increasingly relevant to military swarm operations or congested urban air mobility.
Sensor diversity
The growing diversity of sensing modalities has placed sensor fusion at the center of modern detect-and-avoid architectures. Rather than treating individual sensors as authoritative, modern systems treat their outputs as probabilistic inputs. Fusion engines reconcile timing, geometry, and confidence across heterogeneous data streams, producing tracks and hazard assessments that are robust to partial loss, interference, or deception.
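One common building block for such fusion is information-form (inverse-covariance) weighting of independent estimates, sketched below. Real fusion engines must also align time bases and reference frames and guard against correlated errors, all omitted here for brevity:

```python
# Covariance-weighted fusion of two independent position estimates,
# e.g., a coarse radar plot and a tighter EO/IR-derived fix.
import numpy as np

def fuse(x1, P1, x2, P2):
    """Information-form fusion: weight each estimate by its inverse covariance."""
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(I1 + I2)       # fused covariance
    x = P @ (I1 @ x1 + I2 @ x2)      # confidence-weighted state
    return x, P

# The more precise sensor dominates the fused answer, and the fused
# uncertainty reflects the agreement between the two.
radar_x, radar_P = np.array([1200.0, 300.0]), np.diag([400.0, 400.0])
eoir_x,  eoir_P  = np.array([1180.0, 310.0]), np.diag([25.0, 25.0])
fused_x, fused_P = fuse(radar_x, radar_P, eoir_x, eoir_P)
```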
This shift has been enabled by advances in embedded computing, including multicore processors, graphics processing units (GPUs), and specialized AI accelerators that operate within the power and thermal constraints of aircraft. High-speed interconnects and deterministic middleware enable data to move between sensors, processors, and flight controls with bounded latency.
Increasingly, fusion extends into decision-making, where risk assessments are informed by sensor confidence, vehicle performance limits, mission context, and rules of engagement. In uncrewed systems, this logic must operate autonomously, often with limited opportunity for human intervention.
As autonomy expands operational envelopes, the time available to act is shrinking. Airborne systems must resolve conflicts in seconds at closing speeds measured in hundreds of knots. That reality places a premium on architectures that separate perception, risk assessment, and maneuver generation into layered functions, allowing autonomy to be constrained or expanded based on confidence and mission context.
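A back-of-the-envelope calculation makes that time pressure concrete; the numbers below are illustrative, not drawn from any specific encounter model:

```python
# Two aircraft converging head-on at 250 knots each close at roughly
# 257 m/s, leaving under 40 seconds from a 10 km detection range for
# sensing, fusion, risk assessment, and maneuver execution combined.
KT_TO_MPS = 0.5144

closure_mps = (250 + 250) * KT_TO_MPS                 # ~257 m/s combined closure
detection_range_m = 10_000.0
time_to_conflict_s = detection_range_m / closure_mps  # ~38.9 s
```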
Certifying trust
In detect-and-avoid, technical capability alone is insufficient. Systems must demonstrate predictable behavior not only under nominal conditions but also in degraded, ambiguous, and adversarial environments.
Many technologies that drive detect-and-avoid performance are inherently probabilistic. Sensor fusion weights confidence across inputs. Machine learning models infer object class and intent rather than calculating them explicitly. Even radar tracking increasingly relies on adaptive and statistical techniques. While these approaches improve detection and reduce false alarms, they challenge certification frameworks built around deterministic behavior.
As a result, assurance is shifting from proving exact outcomes to demonstrating bounded behavior across a wide operational envelope. Simulation, digital twins, and scenario-based testing have become essential certification tools, enabling developers to evaluate millions of synthetic encounters, including rare but safety-critical corner cases.
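A toy version of such a campaign, sampling random encounter geometries and tallying loss-of-separation events for a candidate avoidance policy, might look like the sketch below; real certification campaigns run millions of far higher-fidelity encounters:

```python
# Monte Carlo encounter screening sketch: sample head-on geometries and
# count how often a simple braking policy preserves minimum separation.
import random

def sample_encounter():
    """Random initial range (m) and closure rate (m/s)."""
    return random.uniform(2_000, 15_000), random.uniform(50, 300)

def keeps_separation(policy, gap, v, d_min=150.0, dt=0.5):
    """Integrate the encounter; True if separation never drops below d_min."""
    while gap > 0 and v > 0:
        v += policy(gap, v) * dt     # policy returns closing acceleration
        gap -= v * dt
        if gap < d_min:
            return False
    return True

brake_late = lambda gap, v: -2.0 if gap < 3 * v else 0.0  # 3-second rule
losses = sum(not keeps_separation(brake_late, *sample_encounter())
             for _ in range(10_000))
print(f"loss-of-separation rate: {losses / 10_000:.3f}")
```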
Architecture also shapes trust. Regulators and military authorities favor systems that degrade gracefully rather than fail abruptly. That preference drives designs with sensor diversity, cross-checking across modalities, and explicit fallback behaviors. In some cases, autonomy is deliberately constrained, reverting from automated maneuvering to alerting-only modes when confidence falls below a defined threshold.
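A minimal sketch of such a confidence-gated fallback ladder follows; the thresholds are placeholders that a certification authority would expect to be justified by analysis and test:

```python
# Confidence-gated autonomy modes: automated maneuvering is permitted
# only at high fused-track confidence, degrading gracefully otherwise.
from enum import Enum, auto

class Mode(Enum):
    AUTO_MANEUVER = auto()   # full maneuver authority
    ALERT_ONLY = auto()      # cue the pilot or supervisor, no auto maneuver
    SAFE_FALLBACK = auto()   # climb, loiter, or return-to-base behavior

def select_mode(fused_confidence: float, link_healthy: bool) -> Mode:
    if fused_confidence >= 0.9:
        return Mode.AUTO_MANEUVER
    if fused_confidence >= 0.5 and link_healthy:
        return Mode.ALERT_ONLY      # keep a human in the loop
    return Mode.SAFE_FALLBACK       # degrade gracefully, never abruptly
```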
Regulatory and certification challenges extend beyond deterministic safety. Autonomous systems must demonstrate resilience to jamming, spoofing, cyber-attacks, and sensor degradation. Frameworks increasingly emphasize bounded behavior, assessing probabilistic decision-making in terms of worst-case outcomes rather than exact predictions.
Digital twins, scenario-based evaluations, and physics-based simulations are now central to the approval process, particularly for military operations in contested airspace. Human oversight is tested not only in normal operations but in degraded or adversarial conditions, ensuring that confidence metrics, fallback strategies, and alerting mechanisms function as intended.
Urban operations
This year is shaping up to be pivotal for AAM operations in restricted airspace, as commercial players, including electric vertical takeoff and landing (eVTOL) companies such as Joby Aviation in Santa Cruz, Calif., and Archer Aviation in San Jose, Calif., work to certify their aircraft with the U.S. Federal Aviation Administration (FAA) and pursue Department of Defense opportunities.
AAM introduces detect-and-avoid challenges that differ in important ways from both traditional aviation and small uncrewed aircraft operations. AAM vehicles are expected to operate at low altitudes near dense infrastructure and in proximity to other aircraft during takeoff, landing, and transition phases, often in urban or suburban environments where obstacles are numerous and airspace is constrained. These conditions compress reaction times and reduce margins for error, placing new demands on onboard detect-and-avoid systems.
Unlike conventional fixed-wing aircraft, AAM aircraft frequently transition between vertical and forward flight, altering sensor fields of view, aerodynamic performance, and maneuvering capability within seconds. Detect-and-avoid systems must maintain continuous awareness across these transitions, accounting not only for other aircraft but also for buildings, towers, cranes, cables, and ground vehicles. Many of these hazards are static or slow-moving yet difficult to detect reliably with a single sensing modality.
Radar plays a key role in AAM detect-and-avoid, particularly for detecting noncooperative airborne traffic beyond visual range and in degraded visibility. However, SWaP constraints often limit antenna aperture and transmit power, making radar performance highly dependent on signal processing and track management. As a result, radar is typically paired with EO and IR sensing to provide near-field resolution and semantic context during critical phases of flight.
Vision-based sensing is especially important in AAM because it can detect obstacles with minimal radar cross section, such as wires or small uncrewed aircraft. At the same time, urban visual environments challenge perception systems with cluttered backgrounds and variable lighting. This has accelerated interest in onboard, edge-based perception and fusion architectures that reduce latency and avoid reliance on continuous ground connectivity.
From a certification perspective, AAM highlights the challenge of reconciling autonomy with assurance. Many concepts of operation envision high traffic density, limited human oversight, and automated conflict resolution at vertiports or along defined corridors.
Regulators are therefore evaluating detect-and-avoid systems not only on detection performance but also on their ability to behave predictably under uncertainty, degrade gracefully as confidence drops, and clearly signal system state to pilots or supervisory automation.
In this context, detect-and-avoid is becoming a key enabler of AAM scalability. Its maturity will shape how quickly urban airspace can support routine, high-density operations and how confidently regulators, operators, and the public can trust autonomous or semi-autonomous flight in close proximity to people and infrastructure.
Ground-based detect-and-avoid
Of course, machine autonomy extends beyond the airborne domain. Ground vehicles operate in dense, unpredictable environments where obstacles are measured in centimeters and hazards may be concealed or adversarial. Autonomous combat vehicles, logistics convoys, and robotic ground vehicles must detect terrain features, stationary obstacles, and moving entities while accounting for occlusion, uneven surfaces, and rapid changes in line of sight. This challenge is compounded in urban and complex terrain, where pedestrians, small vehicles, and temporary structures create highly dynamic obstacles.
Lidar provides high-resolution 3D mapping at short range, enabling precise localization and obstacle detection. Radar adds robustness in dusty, smoky, and adverse weather conditions, while EO and IR cameras contribute to classification and situational understanding. Acoustic sensing and proprioceptive feedback further inform obstacle avoidance, particularly in low-speed or confined operations. Onboard processing systems integrate these inputs with vehicle control loops to execute safe maneuvers that comply with mission rules and tactical priorities.
Military programs, including autonomous convoy tests and urban combat simulations, are expanding the role of detect-and-avoid in ground operations. Vehicles increasingly rely on cooperative and noncooperative detection, with AI models predicting the movement of other vehicles, personnel, and potential threats. Conservative safety envelopes ensure that even when confidence is low, vehicles take measures to avoid collisions or unsafe maneuvers, preserving trust in autonomy for operational commanders.
Maritime detect-and-avoid
Maritime detect-and-avoid occurs in open yet complex environments. Surface and subsurface autonomous vessels operate with fewer fixed obstacles than ground vehicles but must contend with dynamic conditions such as waves, reflections, weather, and targets ranging from large ships to small craft with minimal signatures.
Radar remains the primary sensor for long-range surface detection, supplemented by EO, IR, and sonar for surface and subsurface awareness. International collision regulations dictate maneuvering logic, while military missions introduce tactical considerations that may conflict with purely safety-driven behavior. Maritime detect-and-avoid must balance safety, compliance, and operational objectives, often over extended time horizons. Trust is built through consistent, predictable behavior and robust sensor fusion.
Across air, ground, and maritime domains, detect-and-avoid has evolved from a discrete safety function into a core autonomy enabler. While physical environments differ, common design patterns have emerged: sensor diversity, probabilistic fusion, layered decision logic, and graceful degradation.
Airborne systems emphasize radar for noncooperative detection at range, augmented by EO/IR sensing for classification and near-field awareness. Ground vehicles prioritize lidar and vision for centimeter-scale resolution, with radar providing robustness in dust, smoke, and adverse weather. Maritime platforms rely on radar as the primary sensor, supplemented by optical sensing and sonar. Across all three, sensor diversity is a prerequisite for graceful degradation in real-world conditions.
Fusion and decision-making link sensors to action. Autonomous systems must reconcile conflicting inputs, evaluate risk, and generate safe maneuvers within time-critical windows. For aircraft, this may mean resolving conflicts in seconds; for ground vehicles, navigating dense terrain; and for maritime systems, complying with collision regulations while accounting for slow maneuvering dynamics.
Trust and assurance are the unifying requirement. Systems must behave predictably to humans and resiliently to uncertainty. Layered architectures, confidence estimation, and fallback behaviors provide bounded autonomy that enables high-performance operation without exceeding safety limits.
Viewed together, detect-and-avoid is less about solving three separate problems than about managing a common tension: enabling machines to perceive and act in complex environments without exceeding the limits of trust. As military and aerospace platforms move toward higher levels of autonomy, detect-and-avoid is becoming the connective tissue that links sensing, decision-making, and assurance across air, land, and sea.
About the Author
Jamie Whitney
Senior Editor
Jamie Whitney joined the staff of Military & Aerospace Electronics in 2018. He oversees editorial content and print production, produces news, features, and Webcasts, and attends industry events.