Key Highlights
- Sensor fusion combines data from multiple sensors like radar, IR, lidar, and electro-optical systems to create comprehensive battlefield pictures in real time.
- Edge AI processes sensor data directly onboard platforms, enabling autonomous decision-making and reducing reliance on remote data centers.
- SWaP-C optimized embedded platforms integrate high-performance computing, AI acceleration, and secure connectivity into compact, rugged systems.
NASHUA, N.H. - Modern defense and aerospace platforms generate data at rates that were unimaginable just a decade ago. A single aircraft or uncrewed system can carry multiple sensors, each capturing streams of information simultaneously. Fighter jets, uncrewed aerial vehicles, satellites, and airborne radar systems collect rich data from electro-optical and infrared cameras, synthetic aperture radar, lidar, and inertial navigation systems. The volume of data available for analysis grows with every new sensor added to a platform.
While this proliferation of data provides a potential edge in battlefield awareness and target engagement, it also creates a significant challenge. Commanders and analysts must sort through torrents of data in near real time to make decisions that affect operations and human lives. Traditional approaches that rely on human analysis alone cannot scale to meet this challenge.
Chris Cuifo, president and chief technology officer at General Micro Systems (GMS) in Rancho Cucamonga, Calif., notes that "Recent industry articles have reported on US commanders commenting on the overabundance of so much sensor data that no human could ever make sense of all of it, much less coalesce it into a meaningful result in near real time that might affect the optempo." He emphasizes that the key advance yet to be realized is using artificial intelligence (AI) effectively in embedded systems rather than just running concepts on rackmount servers in data centers, explaining that "Warfighters fight in-theater, not in a simulator."
Sensor fusion and edge AI
One of the most visible examples of sensor fusion in action is the Lockheed Martin F‑35 Lightning II. The F‑35’s avionics suite combines data from a variety of sensors, including advanced active electronically scanned array (AESA) radar, electro-optical distributed aperture systems, and integrated communication, navigation, and identification systems. These sensors are fused by the aircraft’s mission computers to form a single tactical picture that the pilot can act on rapidly.
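The statistical principle behind combining overlapping sensor estimates can be illustrated with inverse-variance weighting, a textbook fusion rule for independent measurements. This is a simplified sketch only, not the F-35's actual algorithms, and the sensor figures are invented:

```python
def fuse_estimates(measurements):
    """Fuse independent (value, variance) estimates of the same quantity
    by inverse-variance weighting -- the optimal linear combination when
    measurement errors are independent and Gaussian."""
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    fused_value = sum(w * v for w, (v, _) in zip(weights, measurements)) / total
    fused_variance = 1.0 / total  # always smaller than any single input variance
    return fused_value, fused_variance

# Hypothetical example: a coarse radar range estimate fused with a
# precise lidar estimate of the same target.
radar = (1000.0, 100.0)  # (metres, variance)
lidar = (1010.0, 4.0)
value, variance = fuse_estimates([radar, lidar])
```

The fused value lands close to the more trusted sensor, and the fused variance is lower than either input alone, which is why adding sensors tightens the tactical picture rather than merely duplicating it.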
"What each service is doing is adding more sensors, connecting them together and to any available asset - such as an F-35 or C130J - and then using some sort of AI processing to interpret the data into 'actionable intelligence,'" GMS' Cuifo says.
A key enabler of modern sensor and signal processing is edge AI, which executes artificial intelligence algorithms directly on the platform where data is collected rather than relying on remote data centers. This allows systems to process sensor inputs in real time, make immediate decisions, and reduce the need to transmit massive amounts of data across potentially contested or bandwidth-limited networks. By bringing AI to the edge, embedded computing platforms can deliver rapid, autonomous, or semi-autonomous capabilities while meeting strict size, weight, power, and cost (SWaP-C) constraints.
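The bandwidth argument can be made concrete with a toy example: instead of downlinking raw sensor frames, an edge node keeps only high-confidence detections and caps its report volume. This is a hypothetical sketch; the field names and thresholds are invented for illustration:

```python
def triage_at_edge(detections, confidence_floor=0.8, max_reports=5):
    """Keep only high-confidence detections and cap the downlink volume,
    so the platform transmits compact reports instead of raw frames."""
    keep = [d for d in detections if d["confidence"] >= confidence_floor]
    keep.sort(key=lambda d: d["confidence"], reverse=True)
    return keep[:max_reports]

# Invented onboard inference results for three candidate tracks.
raw = [
    {"track": "A", "confidence": 0.95},
    {"track": "B", "confidence": 0.40},
    {"track": "C", "confidence": 0.88},
]
reports = triage_at_edge(raw)  # only tracks A and C leave the platform
```

The point is architectural, not algorithmic: the filtering decision happens onboard, so a jammed or bandwidth-limited link carries kilobytes of conclusions rather than gigabytes of raw data.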
Kevin So, vice president of Microchip Technology's communications business unit, sees the same shift. "In aerospace and defense, the dominant trend is toward secure, autonomous, and distributed edge systems capable of operating in every environment, even the harshest ones," So says. "The biggest operational impacts in recent years have come from edge intelligence, sensor fusion, and hardware acceleration. One of the most impactful advances has been the shift to edge AI, enabling real-time image and signal processing directly onboard aircraft, satellites, and unmanned systems without relying on ground stations."
This shift allows platforms and modern uncrewed intelligence, surveillance, and reconnaissance (ISR) systems to interpret radar, electro-optical and infrared imagery, and lidar data directly at the edge, reducing latency and improving operational responsiveness. Improvements in high-dynamic-range and low-noise sensors, along with event-based vision technologies, have significantly enhanced detection performance in low-visibility and high-speed environments.
Dan Mor, director of customer solutions at Aitech in Chatsworth, Calif., explains that high-performance edge computing enables "real-time processing of data from multiple sensors and decision-making capabilities in systems at the edge," letting platforms "ingest, process, and act on vast amounts of sensor data" and "detect patterns, anomalies, and potential threats." Embedded GPUs and neural processing units allow sophisticated image and signal processing on compact, rugged platforms suited to operational environments with strict SWaP-C constraints.
Long-endurance ISR platforms
Sensor-rich platforms such as Northrop Grumman's RQ‑4 Global Hawk high-altitude, long-endurance (HALE) UAS illustrate the challenges of high-volume data processing. Global Hawk carries electro-optical/infrared (EO/IR), synthetic aperture radar, and signals intelligence payloads. These systems generate vast amounts of data that must be filtered, prioritized, and analyzed, often in near real time.
Wide Area Motion Imagery (WAMI) systems, such as Constant Hawk, provide persistent surveillance over large areas using high-resolution optical arrays. Modern WAMI systems increasingly rely on onboard image processing to detect changes, track movement, and reduce the data burden before transmission to command centers, helping analysts spot patterns, anomalies, and potential threats in real time.
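A minimal stand-in for the onboard change detection described above is simple frame differencing: flag only the grid cells whose intensity changed between looks. This is illustrative only; real WAMI processing involves image registration, tracking, and vastly larger imagery:

```python
def changed_cells(prev, curr, threshold=30):
    """Return grid coordinates whose intensity changed by more than
    `threshold` between two frames -- a crude stand-in for the change
    detection a WAMI processor runs before downlinking."""
    return [
        (r, c)
        for r, row in enumerate(curr)
        for c, val in enumerate(row)
        if abs(val - prev[r][c]) > threshold
    ]

# Two invented 2x3 intensity grids; a bright object appears at (0, 1).
frame1 = [[10, 10, 10], [10, 10, 10]]
frame2 = [[10, 90, 10], [10, 10, 12]]
moves = changed_cells(frame1, frame2)
```

Only the changed coordinates need to cross the datalink, which is the "reduce the data burden before transmission" step in miniature.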
SWaP-C constraints
As sensor arrays proliferate and AI becomes embedded in signal chains, embedded computing itself has become a mission-critical element rather than a supporting one.
"Embedded computing is no longer just a support component; it is the mission's 'brain,'" So says. "The challenge has always been the SWaP-C trade-off: you could have high performance or high reliability, but rarely both in a low-power envelope. Modern designs are moving away from multi-chip solutions toward integrated, heterogeneous architectures. Embedded computing platforms are addressing this by consolidating diverse processing tasks—real-time control, AI/ML acceleration, and high-speed networking—into a single, hardened SoC. This drastically reduces the physical footprint and power draw while increasing the computational density. In an industry where every gram and milliwatt counts, the ability to perform teraflops of math in a single device is what allows us to shrink sophisticated platforms without sacrificing mission longevity."
With missions relying on interconnected technologies, SWaP-C optimization is a top priority. Mor points out that the industry must "deliver tech with much lower and more sustainable power consumption and costs while maintaining overall performance." Systems need to deliver higher CPU, GPU, and AI processing performance along with high-speed connectivity, while holding power consumption to levels that rugged, harsh-environment designs can dissipate, so that embedded platforms can accelerate AI-driven workloads efficiently.
Open standards
Open-architecture standards such as the Sensor Open Systems Architecture (SOSA) and OpenVPX have transformed the design of embedded processing modules. Mor notes that the industry is "moving to open standards for fast fielding, easy upgrades and interoperability/lifecycle support." Alignment with these standards has become a de facto requirement for most new high-end mission computers. Designing in full alignment with open standards creates a common framework for rapid system integration, multi-vendor interoperability, and future scalability across defense platforms.
The Modular Open Systems Approach (MOSA) is a Department of Defense-endorsed framework that encourages the use of standardized, modular hardware and software interfaces in defense systems. Its goal is to enable rapid upgrades, multi-vendor interoperability, and easier lifecycle support by avoiding proprietary, single-vendor designs. By breaking systems into well-defined, interoperable modules, MOSA allows components to be swapped or upgraded without redesigning entire platforms, reducing costs and increasing flexibility over time.
Building on MOSA principles, SOSA focuses specifically on mission computing and sensor systems. SOSA defines a set of standardized hardware form factors, interfaces, and protocols to ensure that sensors, mission computers, and processing modules can work together seamlessly across different platforms. This alignment enables defense programs to quickly integrate multiple vendors' technologies, supports edge computing and AI workloads, and promotes scalable, high-performance sensor processing across airborne, ground, and naval systems.
Together, MOSA and SOSA provide the foundation for modern, open-architecture defense electronics, enabling mission systems to evolve more rapidly in response to emerging threats and technological advances while maintaining compatibility with existing platforms.
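The modularity principle can be sketched in code: the mission computer programs against a common interface, so a conforming module from any vendor can be swapped in without touching the integration layer. This is a loose software analogy; actual MOSA/SOSA conformance concerns standardized hardware form factors, interfaces, and protocols, not Python class hierarchies:

```python
from abc import ABC, abstractmethod

class SensorModule(ABC):
    """A standardized module interface: any conforming sensor can be
    swapped in without changing the mission computer's code."""
    @abstractmethod
    def read(self) -> dict: ...

class RadarModule(SensorModule):
    def read(self):
        return {"type": "radar", "range_m": 1200.0}

class LidarModule(SensorModule):
    def read(self):
        return {"type": "lidar", "range_m": 1195.5}

def mission_computer(sensors):
    # The integrator codes against the interface, not the vendor.
    return [s.read() for s in sensors]

picture = mission_computer([RadarModule(), LidarModule()])
```

Replacing `LidarModule` with a different vendor's conforming module changes nothing upstream, which is exactly the upgrade-without-redesign property MOSA is after.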
AI/ML in sensor pipelines
AI and machine learning (ML) are moving deep into sensor and image processing pipelines. "AI and machine learning are now built into signal and image processing systems to help with tasks like detecting objects, classifying targets, tracking movement, and reducing noise," So notes. Running models directly on edge devices allows faster decision-making without sending all data to a central system, which is especially important for airborne and satellite platforms.
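The stages listed above (noise reduction, detection, classification) can be sketched as chained functions over a one-dimensional signal. The thresholds and data here are toy values, invented purely for illustration:

```python
def denoise(samples, window=3):
    """Moving-average noise reduction over a trailing window."""
    out = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def detect(samples, threshold=5.0):
    """Flag indices whose smoothed value exceeds the detection threshold."""
    return [i for i, v in enumerate(samples) if v > threshold]

def classify(index, samples):
    """Toy classifier: strong returns are 'vehicle', weak ones 'clutter'."""
    return "vehicle" if samples[index] > 8.0 else "clutter"

# An invented signal with one strong return in the middle.
signal = [0.1, 0.3, 9.5, 9.8, 9.6, 0.2, 0.1]
smoothed = denoise(signal)
hits = detect(smoothed)
labels = [classify(i, smoothed) for i in hits]
```

Production pipelines replace each stage with trained models and hardware-accelerated kernels, but the structure, a chain of stages each consuming the previous stage's output, is the same.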
Technical and operational limitations remain. Cuifo highlights that reliance on tactical network connectivity could create vulnerabilities: "Even in rudimentary battlefield operations, warfighters are taught to fight without radio contact with other assets—should the RF spectrum be jammed or simply unavailable due to terrain or equipment failure. Now imagine how this problem multiplies if a tactical internet link was required for the equipment to execute the battle plan."
Operational resilience
Maintaining operational resilience in contested environments is essential. Cuifo points to examples in Ukraine, where forces have adapted to disrupted communications by employing resilient, low-cost technologies and fallback procedures. This ensures missions continue even when command-and-control networks are compromised.
Time-sensitive networking (TSN) and deterministic data delivery, as Mor describes, ensure that sensor data can be processed predictably in real time, even in harsh operational conditions. These capabilities are critical for platforms supporting safety- and mission-critical decision-making.
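The core TSN idea, giving each critical stream a fixed transmit window inside a repeating cycle so worst-case latency is known in advance, can be sketched as a simple slot planner. This is a schematic of IEEE 802.1Qbv-style time-aware scheduling, not a real TSN implementation; the stream names and cycle length are invented:

```python
def build_schedule(streams, cycle_us=1000):
    """Assign each critical stream a fixed, repeating transmit window
    (start_us, end_us) inside the cycle, so worst-case latency is
    bounded by design rather than by best-effort queueing."""
    slot = cycle_us // len(streams)
    return {
        name: (i * slot, i * slot + slot)
        for i, name in enumerate(streams)
    }

# Three invented sensor streams sharing a 1 ms cycle.
schedule = build_schedule(["radar", "eo_ir", "nav"])
```

Because every stream owns its window in every cycle, a burst on one stream cannot delay another, which is the determinism safety- and mission-critical platforms depend on.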
Future outlook
The continued integration of edge AI, high-performance computing, and open architecture standards is expected to reshape sensor, signal, and image processing in military and aerospace applications. Developers will create smaller, more powerful embedded platforms capable of executing complex analytics in real time. Open standards will enable modularity, interoperability, and lifecycle flexibility across defense systems. As technologies mature, operational resilience in denied or degraded network environments will remain a priority. Advances in edge intelligence, sensor fusion, and hardware acceleration will equip commanders and warfighters with tools to gain situational advantage and make timely decisions even in complex and contested operational environments.
Trending products
So explains that Microchip Technology has introduced new products that directly reflect the trends in embedded sensor, signal, and image processing with AI built into the edge, as well as secure, low-latency computing for aerospace, defense, and other mission-critical applications.
"One key example is the PIC64HX, a new family of high-performance, multicore 64-bit RISC-V microprocessors (MPUs) designed for intelligent edge computing. These MPUs combine open standards-based vector-capable AI/ML processing, integrated Time-Sensitive Networking, and defense-grade security - including support for post-quantum cryptography - to enable real-time inference, sensor fusion, and deterministic networking directly on embedded platforms without needing centralized processing."
Cuifo from GMS notes the company's X9 AI Mission Computer is a high-performance, rugged embedded platform designed for AI and sensor processing at the tactical edge. It is powered by an Intel Xeon W CPU with 8 cores and 24MB of smart cache, operating at 2.6GHz with turbo speeds up to 4.7GHz, and supports up to 128GB of ECC DDR4 RAM and 16TB of high-performance SSD storage. This combination enables real-time processing of multiple high-speed sensor streams, including electro-optical, infrared, and radar data, supporting sensor fusion and AI inference workloads in demanding environments.
The system offers high-speed connectivity with four 100GigE ports and multiple Thunderbolt 4 connections, along with options for 5G, Wi-Fi 6, Bluetooth 5, and HD GPS. Expansion is supported through several Rugged Mezzanine Carrier sites, enabling additional storage, networking, or AI acceleration for specialized missions. These features allow the X9 to connect with a wide variety of sensor and communication systems, providing reliable, low-latency processing for real-time decision-making.
Mor from Aitech states that the company’s SOSA-aligned mission computers are built for high-performance edge computing, AI, machine learning, and real-time sensor data processing across ground and aerospace military applications. These systems are designed to handle complex, high-bandwidth workloads at the edge, enabling quick decision-making and lowering dependence on centralized processing resources.
The company is upgrading its A230 AI GPGPU system to meet the specific high-speed and deterministic needs of modern defense and aerospace electronics. Improvements include 10GbE and USB 3.x interfaces, a 1GbE TSN endpoint for predictable, low-latency data transfer, and four high-resolution HD-SDI video inputs. These updates enable the system to process real-time, high-quality video from multiple sensors while ensuring fast and reliable data transfer between sensors, processing nodes, and external mission systems.
The integration of TSN ensures deterministic and synchronized data delivery, which is crucial for safety- and mission-critical aerospace and defense platforms. Mor states that by combining reliable connectivity with AI acceleration and GPGPU processing, these mission computers offer a flexible, scalable infrastructure for edge-based sensor fusion, situational awareness, target detection, and advanced analytics in operational environments where reliability and low latency are vital.
About the Author
Jamie Whitney
Senior Editor
Jamie Whitney joined the staff of Military & Aerospace Electronics in 2018. He oversees editorial content and print production, produces news, features, and Webcasts, and attends industry events.