Sensor fusion uses COTS technology to integrate threat data and increase situational awareness

June 1, 2000

By John Rhea

The trick for designers is to blend the right hardware with the right software to merge data from a broad range of sensors and synthesize it into readily understandable information for warfighters in the field.

The ultimate mechanism for correlating disparate tactical data, as measured by sophisticated information-processing capability, is the human brain.

But that is only one side of the equation. The other side is the sensors, where the human eye and ear are limited to very narrow bands of the electromagnetic spectrum. The equation does not work unless the human eye, ear, and brain are in balance.

Humans in combat conditions, like the electronic equipment they fight with, are vulnerable to reliability and survivability problems. To compensate for this vulnerability, military researchers have put heavy emphasis on developing machines to gather and process warfighting information and give humans the information they need to perceive their battlefield situations accurately without putting human lives at risk.

These efforts, sponsored by the Defense Advanced Research Projects Agency (DARPA) in Arlington, Va., and the research arms of the U.S. military services, have spawned multispectral sensors — many of them distributed among weapons platforms — more powerful computers, inter- and intra-system data channels of ever-increasing bandwidth, and supporting computer architectures and software.

This has been an essentially evolutionary process involving upgrades to such proven technologies as the MIL-STD-1750A airborne computer and the MIL-STD-1553 databus, new sensors derived from DARPA-sponsored research in such advanced materials as gallium arsenide and indium phosphide, and predictable advances at the electronic component level that are yielding affordable commercial off-the-shelf (COTS) devices.

This graphic depicts how the Multimedia Analysis and Archive System from the General Dynamics Space and Surveillance strategic business unit processes tactical data.

The net result is readily available supporting technologies that military leaders continuously introduce into new generations of weapon systems and retrofit into existing systems.

An example of this is the U.S. Navy's Cooperative Engagement Capability, or CEC. This architecture enables deployed forces such as Aegis cruisers and P-3C patrol aircraft (and eventually E-2C airborne early warning aircraft) to share radar data to create a composite track of hostile forces. In this way combat commanders can expand their areas of detection and the detail of information about their potential targets.

CEC also circumvents a common problem where radar systems working alone can lose signals in ground clutter. CEC allows several different vessels to share radar data so all ships plugged into the CEC network can continue tracking targets even when the targets move outside the range of some radar sensors. Defense systems designers also are applying this concept to systems that defend against threats such as cruise missiles, and perhaps even theater ballistic missiles.
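
To make the composite-track idea concrete, here is a minimal sketch in Python, with hypothetical names and synthetic data rather than anything from the actual CEC software: each ship contributes the radar plots it can see, and the shared track survives as long as any participant still holds the target.

    # Hypothetical sketch of a CEC-style composite track: each ship reports
    # the radar plots it can see, and the network-level track survives as
    # long as at least one participant still holds the target.
    from dataclasses import dataclass, field

    @dataclass
    class Plot:                      # one radar measurement from one ship
        ship: str
        target_id: str
        x_km: float
        y_km: float

    @dataclass
    class CompositeTrack:
        target_id: str
        contributors: set = field(default_factory=set)
        n_plots: int = 0
        x_km: float = 0.0
        y_km: float = 0.0

    def fuse(plots):
        tracks = {}
        for p in plots:
            t = tracks.setdefault(p.target_id, CompositeTrack(p.target_id))
            t.contributors.add(p.ship)
            t.n_plots += 1
            # naive running average of all contributing measurements
            t.x_km += (p.x_km - t.x_km) / t.n_plots
            t.y_km += (p.y_km - t.y_km) / t.n_plots
        return tracks

    # A target masked by clutter for one ship stays tracked if another sees it.
    plots = [Plot("Aegis-1", "T42", 10.2, 5.1), Plot("P-3C", "T42", 10.4, 5.0)]
    print(sorted(fuse(plots)["T42"].contributors))   # ['Aegis-1', 'P-3C']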

The CEC demonstrates that radar continues to be the primary tactical sensor to detect hostile forces. Yet the services are moving into another area of the electromagnetic spectrum — infrared — to develop more sophisticated and survivable sensors. While radar is essential, it is vulnerable to detection and jamming. The solution is to complement it with passive electro-optical sensors that are virtually immune to detection and countermeasures.

Windows of detection

There are only three windows in the electromagnetic spectrum through which sensors can detect targets. The most obvious (and least useful for tactical purposes) is visible light, or the portion visible to the human eye — the 0.4-to-0.7-micron region.

The other two are in the IR portion. The 3-to-5-micron band (known as mid-infrared) is especially good at detecting thermal radiation such as fires and the plumes of missiles. Still, shielding the heat source or employing other stealth technologies can easily fool sensors that operate in this region.

A better region is the 8-to-12-micron band (far-infrared), because sensors operating in this region can detect internally generated heat ranging from the body heat of individual soldiers to the heat accumulated by the skin of an aircraft as it travels through the atmosphere.
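
Those three windows amount to a small lookup table. The illustrative fragment below, with band edges taken from this article, simply classifies a wavelength into a window:

    # Illustrative lookup of the three detection windows described above
    # (band edges as given in the article; wavelengths in microns).
    WINDOWS = [
        (0.4, 0.7,  "visible",       "human eye; least useful tactically"),
        (3.0, 5.0,  "mid-infrared",  "hot sources: fires, missile plumes"),
        (8.0, 12.0, "far-infrared",  "internally generated heat: people, aircraft skin"),
    ]

    def window_for(wavelength_um):
        for lo, hi, name, use in WINDOWS:
            if lo <= wavelength_um <= hi:
                return f"{name}: {use}"
        return "outside the three detection windows"

    print(window_for(9.5))   # far-infrared: internally generated heat ...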

A U.S. Army soldier based in Germany works on a command, control, communications, and intelligence system that combines data from a variety of communications and surveillance sensors.

The ultimate goal of this focus on the IR spectrum is to convert the inherent radiation from potential targets into images that military commanders can discern in any kind of weather or lighting conditions. These electro-optical systems also offer greater accuracy in azimuth, elevation, and range compared to conventional radars.

U.S. Army leaders are engaged in a so-called "system of systems" integration effort as part of their battlefield digitization program. This effort, which has been in progress for more than a decade, seeks to automate five of the most fundamental components of land combat — maneuver, air defense command and control, intelligence/electronic warfare, fire support, and logistics/medical support.

Supervising the program are officials at the Program Executive Office for Command, Control, and Communications Systems (PEO C3S) at Fort Monmouth, N.J., and its test facility, the Central Technical Support Facility (CTSF) at Fort Hood, Texas. The effort supports the scheduled deployment in September of the Army's first digitized division, the 4th Infantry Division, also located at Fort Hood.

A critical element in this effort is an intensive fast-track spiral software development to circumvent the problems of traditional development cycles in which the software becomes antiquated by the time it reaches the battlefield. Spiral development allows new software versions to be designed, tested, refined, and deployed to the field in minimal time.

"Our mission involves the acquisition, development, and deployment of tactical computer and communications systems to integrate the best components from today's state-of-the-art command and control systems with new technology," explains Thomas Kerrigan, logistics management specialist at PEO C3S, which manages the system architecture for the digitization effort.

"Our chief challenge is to integrate the various C3 systems so that they can operate seamlessly on the battlefield, providing the required support to the warfighters," Kerrigan adds. This involves a wide variety of collaborative planning tools, including video teleconferencing, white boarding, and lower tactical Internet access. As a result, he notes, soldiers can use Internet browsers and electronic mail in their command posts to move documents that were previously carried via couriers.

American soldiers man a tactical operations center, where computers automate the tasks of overlaying important information on the relative whereabouts of friendly and hostile forces.

The digital conversion project is being supported by Robbins-Gioia, a program management company in Alexandria, Va., which evaluated the conversion project for PEO C3S. "We began our work with the development of a responsibility matrix," notes Scott Fass, general manager of Robbins-Gioia. "The matrix enabled the CTSF to view all resources and current responsibilities, so that they can make the most effective use of their available personnel, time, and resources."

A supporting technology derived from the commercial world that is finding great use in tactical situations is multimedia. The General Dynamics Space and Surveillance strategic business unit, based in Minneapolis, has begun installing a system to provide tactical analysts operating unmanned aerial vehicles, or UAVs, with live video, VHS tapes, digital images, digital documents, and scanned hard copy reports and photographs.

The system, called the Multimedia Analysis and Archive System (MAAS), supported the Bosnia peacekeeping operation and Kosovo monitoring. It has been employed with the Hunter and Predator UAVs, the P-3C Orion maritime patrol aircraft, and Low-Altitude Navigation & Targeting Infrared for Night (LANTIRN) pods on tactical aircraft, says Joe Vaughan, chief technologist at General Dynamics Space and Surveillance.

Meanwhile, soldiers in the field process information they have gathered before feeding it to the tactical operations center.

The MAAS units, installed at Royal Air Force bases in England, U.S. Central Command locations in Saudi Arabia and Kuwait, and at the Center for Army Lessons Learned at Fort Leavenworth, Kan., use commercially available Microsoft Windows NT and Microsoft Office 97 software. The analysts receive digitized video in MPEG-1 format on their personal computers. This is not so much a computationally intensive activity as it is input/output intensive, Vaughan notes. A variety of output formats are available, including compact disks.

Entry-level configurations with integrated hardware and applications software cost about $66,000 apiece and are configurable, scalable, and field upgradeable to support several different video input sources and multiple analysts. The idea is to produce what analysts call exploitation support data (ESD), such as the sensor's look angle, latitude, and longitude.

In a new application, MAAS is part of the Global Hawk UAV program, processing National Imagery Transmission Format (NITF) images from which ESD files are stored in the system's archive. When ESD accompany the imagery, analysts can query on ESD fields to locate imagery of interest, and the tracks of the sensor and the platform are overlaid on a digital map in real time. This is essentially a COTS system, according to Vaughan, and one of the lessons learned to date is to continue to automate the reporting process.
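
The ESD query Vaughan describes reduces to filtering an archive on metadata fields. The sketch below is hypothetical; the field names are illustrative, not the actual MAAS or NITF schema.

    # Hypothetical ESD-style metadata query: find archived imagery whose
    # exploitation support data falls inside an area of interest.
    # (Field names are illustrative, not the real MAAS/NITF schema.)
    archive = [
        {"file": "gh_0417.ntf", "lat": 44.8, "lon": 20.5, "look_angle_deg": 32.0},
        {"file": "gh_0418.ntf", "lat": 42.1, "lon": 21.2, "look_angle_deg": 45.5},
    ]

    def query(records, lat_range, lon_range, max_look_angle):
        return [r for r in records
                if lat_range[0] <= r["lat"] <= lat_range[1]
                and lon_range[0] <= r["lon"] <= lon_range[1]
                and r["look_angle_deg"] <= max_look_angle]

    hits = query(archive, (44.0, 45.0), (20.0, 21.0), 40.0)
    print([h["file"] for h in hits])   # ['gh_0417.ntf']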

Other forms of heterogeneous data that designers can integrate with readily available COTS hardware include signal intelligence, or sigint, automatic target recognition (ATR), and sonar, explains Rodger Hosking, vice president of Pentek Inc. of Upper Saddle River, N.J. His firm's new line of 32-channel digital receivers — the velocity interface mezzanine (VIM) module — uses top-of-the-line commercial microprocessors and accepts four analog RF inputs on the front panel.

Ruggedized graphics consoles such as this one from Barco Inc. are being modified to blend live radar and video on the same screen.

In a sigint application, Hosking notes, the system can extract a cellular base station's signal from the composite antenna signal. ATR is also possible using direction finding across several channels. Sonar is bandwidth intensive, and phase-coherent signals are necessary for sonar beamforming. Pentek's current system has a 2 MHz bandwidth, which Hosking says he expects to upgrade to 20 MHz.
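
Phase coherence matters because beamforming works by phase-shifting the channels and summing them. The sketch below is a minimal narrowband delay-and-sum beamformer on synthetic data; it illustrates the principle, not the Pentek hardware.

    # Minimal delay-and-sum beamformer for a uniform line array
    # (synthetic narrowband data; shows why channels must be phase
    # coherent, not how the Pentek receiver implements it).
    import numpy as np

    c = 1500.0                 # sound speed in water, m/s
    f = 2000.0                 # narrowband signal frequency, Hz
    d = 0.25                   # element spacing, m
    n_elem = 32
    theta = np.radians(30.0)   # true arrival angle

    # Simulated phase-coherent element outputs for a plane wave.
    k = 2 * np.pi * f / c
    elements = np.arange(n_elem)
    x = np.exp(1j * k * d * elements * np.sin(theta))

    def beam_power(steer_deg):
        steer = np.radians(steer_deg)
        weights = np.exp(-1j * k * d * elements * np.sin(steer))
        return abs(np.sum(weights * x)) ** 2

    angles = np.arange(-90, 91)
    powers = [beam_power(a) for a in angles]
    print("peak at", angles[int(np.argmax(powers))], "degrees")   # ~30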

ILC Data Device Corp. (DDC) in Bohemia, N.Y., is a hybrid microelectronics house that uses versions of the venerable 1553 databus for a variety of sensor interfacing tasks, says Mike Glass, applications manager. The 1-megabit data rate is adequate for handling such disparate data as signals from acoustic pressure transducers, compressed video, and map data, he says.
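
A back-of-the-envelope budget shows why 1 megabit per second can be adequate for such traffic. The feed rates below are invented for illustration; a real allocation would also account for 1553 protocol overhead.

    # Illustrative MIL-STD-1553 bandwidth budget (feed rates are made up;
    # the bus itself runs at 1 Mbit/s, less protocol overhead).
    BUS_KBPS = 1000

    feeds_kbps = {
        "acoustic pressure transducers": 64,
        "compressed video (low rate)":   384,
        "map tile updates":              128,
    }

    used = sum(feeds_kbps.values())
    print(f"used {used} of {BUS_KBPS} kbit/s "
          f"({100 * used / BUS_KBPS:.0f}% before protocol overhead)")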

One advantage of 1553 is its ability to rely on COTS boards, including the 15-megabit fiber optic versions. Among the users are the U.S. Air Force's F-22 fighter and the multi-service Joint Strike Fighter, or JSF. Nonetheless, Fibre Channel is making inroads in such speed-critical applications as real-time video, radar, and infrared. Expected users include the U.S. Navy's F/A-18 Super Hornet fighter-bomber, the U.S. Air Force's B-1 strategic bomber, and the next design phase of the U.S. Army's AH-64 Apache Longbow helicopter.

DDC officials are developing a PCI mezzanine card version of their 1553 conduction- and convection-cooled boards, with as many as four channels, using PowerPC microprocessors. Glass calls this his company's strategy to "move up the food chain." Beta testing has begun, and initial shipments are set for this summer, he says.

The other mainstay of military avionics — and hence of sensor fusion, among other functions — is the ubiquitous MIL-STD-1750 microprocessor-based "black box." This is also undergoing an upgrade under a modernization program for the U.S. Air Force's F-16 fighter now in progress at CPU Tech in Pleasanton, Calif.

Richard Comfort, vice president of the firm, says his goal is to transform the long-established 1750 technology into a "system on a chip" — but to do so economically. The major impediment to this metamorphosis has been the exorbitant cost of validating the new design in an embedded system, he adds.

That cost is based on what Comfort estimates is a requirement for a billion bytes of test code for the validation effort, which he says represents 2,000 programmer-years of work. CPU Tech's approach was to automate that part of the job, in the process reducing the validation effort to 45.5 hours. Comfort says he will not describe the automation process in detail, but does say the company has been at it for 10 years.

The radically modernized MIL-STD-1750A microprocessor from CPU Tech is going into upgraded versions of the U.S. Air Force F-16 jet fighter.

The net result of the system-on-a-chip approach was to reduce the parts count by a factor of 80, CPU Tech officials say. That translates directly into an improved mean time between failures (MTBF), which in this case rose from 150 hours to an estimated 3,000 hours for the boxes in the APG-68 airborne radar.
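
The link between parts count and MTBF follows from series reliability: the system failure rate is roughly the sum of the per-part failure rates, so fewer parts means a lower failure rate. Parts are not all equal, which is why the reported gain, a factor of 20, is smaller than the 80-to-1 parts reduction. A toy calculation under the identical-parts assumption:

    # Toy series-reliability calculation: system failure rate is the sum
    # of part failure rates, so MTBF scales inversely with part count if
    # all parts were identical. Real parts differ, which is why the
    # reported gain (150 h -> ~3,000 h) is less than the 80x reduction.
    old_mtbf_h = 150.0
    parts_ratio = 80.0

    ideal_new_mtbf = old_mtbf_h * parts_ratio    # 12,000 h upper bound
    reported_new_mtbf = 3000.0

    print(f"ideal (identical parts): {ideal_new_mtbf:.0f} h")
    print(f"reported estimate:       {reported_new_mtbf:.0f} h "
          f"({reported_new_mtbf / old_mtbf_h:.0f}x improvement)")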

CPU Tech is about three months into an 18-month effort to produce the necessary modernization kits for the Air Force, and thus far, Comfort says, his engineers have reduced the number of boards necessary from 43 to eight. Because this was a modernization effort, the Air Force's requirements did not stress performance, but Comfort estimates that computational capability should increase by a factor of 10.

Where the whole sensor-fusion process comes together is at the displays, and this facet also has undergone evolutionary improvements. Rich Challen, director of graphics products at Atlanta-based Barco Inc., traces the history back to the late 1980s, when military systems designers began trying to combine live radar and video data on one screen.

This led to a series of supporting products, such as graphics controllers, interfaces for radar scan converters, frame grabbers, and other input devices, which in turn produced what Challen describes as three generations of cards that drove down costs through the use of COTS.

In the case of VME, which he calls "the package of choice for the military," this meant going from three cards with 45 slots to one card with two slots to one card and one slot. Barco leaders introduced a 6U VME version of their graphics controller, known as the advanced visualization system, in April. Barco designers are also working on PCI versions, particularly for the overseas market.

This is a currently available COTS technology, Challen says, known as the auxiliary video interface. It allows the insertion of radar data between the overlay and underlay graphics, enabling users to place any map data beneath the radar image while placing track and target information on top. One application he is particularly enthusiastic about is the proposed unmanned combat aerial vehicle, or UCAV, which in early planning is envisioned to deliver ordnance as well as to conduct surveillance missions.
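
The layering Challen describes, with a map underlay at the bottom, radar video in the middle, and track symbology on top, is in essence back-to-front alpha compositing. The sketch below illustrates the idea with plain "over" blending; it is not Barco's implementation.

    # Illustrative back-to-front compositing of the three layers:
    # map underlay, radar video, symbology overlay.
    # (Not Barco's implementation; plain "over" alpha blending.)
    import numpy as np

    H, W = 480, 640
    map_underlay = np.full((H, W, 3), 0.2)                       # dark basemap
    radar = np.zeros((H, W, 4))
    radar[200:280, 300:380] = (0.0, 0.8, 0.0, 0.6)               # radar video
    overlay = np.zeros((H, W, 4))
    overlay[230:234, 330:334] = (1.0, 1.0, 0.0, 1.0)             # track symbol

    def over(dst_rgb, src_rgba):
        a = src_rgba[..., 3:4]
        return src_rgba[..., :3] * a + dst_rgb * (1.0 - a)

    frame = over(over(map_underlay, radar), overlay)
    print(frame.shape, frame.max())   # (480, 640, 3) 1.0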

Sensor fusion explained — sort of

An insight into the notion of sensor fusion comes from the old folk story about the blind men and the elephant. Everybody knows this story: one blind man touches the trunk and says it's a snake, another touches a leg and says it's a tree, another touches the tail and says it's a rope, and so on.

The problem with the blind men's approach is not necessarily the degraded quality of their data, but rather their inability to integrate the data. In fact, if they had modern sensor fusion systems available to them, in some cases they might have done better than people able to see.

In this story the elephant was cooperating for the most part, simply going about its business of being an elephant, oblivious to the efforts of the blind men to detect it.

But what if it had been a cardboard elephant, perhaps a decoy in a wave of incoming cruise elephants? Then, by sensing the target outside the visual portion of the spectrum, i.e., by touching it, the blind men could have come up with the correct answer when conventional detection methods failed. (In the case of elephants, an even better method might have been to use the sense of smell.)
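
That gain in confidence can be made concrete. If two sensors give independent evidence about a contact, their likelihood ratios multiply; the toy Bayesian fusion below, with invented numbers, shows how a radar return and an 8-to-12-micron signature together produce a sharper verdict than either alone.

    # Toy Bayesian fusion of independent sensor evidence: each sensor
    # reports a likelihood ratio for "real target" vs. "decoy"; under
    # independence the ratios multiply, sharpening the combined verdict.
    def fuse_odds(prior_odds, likelihood_ratios):
        odds = prior_odds
        for lr in likelihood_ratios:
            odds *= lr
        return odds

    prior = 1.0          # even odds before sensing
    radar_lr = 3.0       # radar return looks target-like
    ir_lr = 4.0          # strong far-infrared signature (cardboard stays cold)

    odds = fuse_odds(prior, [radar_lr, ir_lr])
    print(f"posterior P(real target) = {odds / (1 + odds):.2f}")   # 0.92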

That is what sensor fusion is all about — sensing targets across a broad portion of the electromagnetic spectrum and processing the huge amounts of threat data in real time. The result is enhanced confidence in target identification, which in turn permits a more timely response. — J.R.

'Unfusing' sensor data

AMHERST, Mass. — The opposite of integrating sensor data is segregating unwanted data. Designers at Primagraphics Ltd. in Cambridge, England, are working with the U.S. Navy to advance this approach for aircraft surveillance.

David Johnson, the company's technical support manager for North America, based in Amherst, Mass., relates a task that Navy officials faced in extracting identification, friend or foe (IFF) signals from the returned radar signals during patrol missions.

In cases where Navy and U.S. Coast Guard ships patrol adjacent coastal waters in search of drug runners, they need a way to distinguish between the cooperating targets, or "friendlies," emitting their own IFF signals and the non-cooperating aircraft, which may or may not be carrying contraband.

Under a cooperative program with the Naval Air Warfare Center in St. Inigoes, Md. (near the Patuxent River Naval Air Station), Primagraphics engineers installed a primary tracking radar on the ships to work with the center's secondary IFF tracker.

The two inputs are combined in an overall situation display, enabling the patrol personnel to eliminate the cooperating targets and thus focus their attention on the other radar tracks. The whole job was accomplished with relatively simple COTS hardware, Johnson says. — J.R.
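
In code terms, the "unfusing" step reduces to a correlation gate: drop every radar track that matches an IFF reply within some distance, and display what remains. The sketch below is hypothetical and is not Primagraphics' actual correlation logic.

    # Hypothetical sketch of "unfusing": drop radar tracks that correlate
    # with an IFF reply, leaving only non-cooperating contacts.
    # (Illustrative only; not Primagraphics' correlation logic.)
    import math

    radar_tracks = [("R1", 10.0, 4.0), ("R2", 22.5, 7.8), ("R3", 3.2, 9.1)]
    iff_replies  = [(10.1, 4.1), (3.3, 9.0)]          # cooperating targets

    def is_cooperator(track, replies, gate_km=0.5):
        _, x, y = track
        return any(math.hypot(x - rx, y - ry) <= gate_km for rx, ry in replies)

    unknowns = [t for t in radar_tracks if not is_cooperator(t, iff_replies)]
    print([t[0] for t in unknowns])   # ['R2'] -- the track worth watching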
