Army considering multispectral sensor fusion to help helicopter pilots land in bad conditions

July 10, 2015
FORT BELVOIR, Va., 10 July 2015. U.S. Army electro-optics experts are reaching out to industry to find companies able to develop multispectral sensor fusion with a distributed aperture system (DAS) to help helicopter pilots fly in fog, dust, smoke, darkness, and other degraded visual environment (DVE) conditions.

Officials of the Fort Belvoir, Va., branch of the Army Contracting Command at Aberdeen Proving Ground issued a request for information (W909MY-15-R-C020) this week for a market investigation into multispectral sensor fusion development.

The integrated system would consist of a forward-looking sensor suite for pilotage in degraded visual environments, fused with a distributed aperture system offering spherical coverage for situational awareness.

The concept involves fusing outputs from a long-wave infrared camera, a light detection and ranging (lidar) sensor, and a radar with existing terrain and image databases to produce a head-tracked image to help the pilot fly in brownout, fog, smoke, rain, and other bad visual conditions.

The system must be interoperable with state-of-the-art Army aviation display technology, balancing performance with small size, weight, and power consumption.

Related: Three companies to develop synthetic-vision avionics to help land helicopters in choking dust

This program will develop a sensor fusion engine and a distributed aperture system. Teams proposing an integrated solution that addresses both efforts will have an advantage.

The distributed aperture system effort involves providing a head-tracked view of the region of interest of several sensors covering a spherical field of view. The ultimate goal is to develop a system for Army helicopters.


The visualization should be compatible with a head-tracked helmet-mounted display with a 1x-magnification fused synthetic image that aligns with the real-world scene on a see-through display. Ultimately the Army wants the system to work with the Joint Common Architecture (JCA) and Future Airborne Capability Environment (FACE).

The system should help Army helicopter pilots derive navigation information from the fused sensor data by estimating velocity from changes in imagery over time, and by matching measured 3D lidar and radar data to digital terrain elevation data to estimate position.
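The solicitation does not specify algorithms, but the two estimation steps above can be illustrated with a minimal sketch (hypothetical code, not from the RFI; the function names, the phase-correlation approach to frame-to-frame motion, and the brute-force terrain-patch search are all assumptions for illustration):

```python
import numpy as np

def estimate_shift(prev, curr):
    # Phase correlation: the peak of the inverse FFT of the normalized
    # cross-power spectrum gives the pixel shift between two frames.
    # Dividing that shift by the frame interval (and scaling by ground
    # sample distance) would yield a velocity estimate.
    F1, F2 = np.fft.fft2(prev), np.fft.fft2(curr)
    cross = np.conj(F1) * F2
    cross /= np.abs(cross) + 1e-12
    corr = np.abs(np.fft.ifft2(cross))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap shifts larger than half the frame into negative values.
    if dy > prev.shape[0] // 2:
        dy -= prev.shape[0]
    if dx > prev.shape[1] // 2:
        dx -= prev.shape[1]
    return int(dy), int(dx)

def match_terrain(measured, dted):
    # Slide a measured elevation patch (e.g. from lidar/radar) over a
    # digital terrain elevation grid and return the offset with the
    # smallest sum-of-squared differences -- a crude position fix.
    mh, mw = measured.shape
    best_err, best_rc = np.inf, (0, 0)
    for r in range(dted.shape[0] - mh + 1):
        for c in range(dted.shape[1] - mw + 1):
            err = np.sum((dted[r:r+mh, c:c+mw] - measured) ** 2)
            if err < best_err:
                best_err, best_rc = err, (r, c)
    return best_rc
```

A fielded system would use far more robust correlation, outlier rejection, and filtering; this only shows the principle of image-motion velocity and terrain-matched position.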

Related: Sierra Nevada to flight test synthetic vision to help helicopter pilots land in zero visibility

The distributed aperture system, meanwhile, should support situational awareness and architectural growth to multifunction threat warning capability. The system will integrate several sensors covering a spherical field of regard with a processor and algorithms that provide a seamless, head-tracked view of any portion of the imagery.

At a minimum, the distributed aperture system must include a processor and algorithms to create an image sphere from the sensor outputs, and to provide a head-tracked view to a helmet-mounted display.
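To give a rough idea of the head-tracked-view step described above, the sketch below (illustrative only; the equirectangular mosaic layout, function name, and default field of view are assumptions, and a real perspective rendering would require reprojection) extracts the window of a spherical image mosaic centered on the pilot's head azimuth and elevation:

```python
import numpy as np

def head_tracked_view(sphere, az_deg, el_deg, fov_deg=40):
    # 'sphere' is an equirectangular mosaic of the stitched sensor
    # outputs: rows span elevation -90..+90 deg, columns span azimuth
    # -180..+180 deg. Return the window centered on the head pose.
    h, w = sphere.shape[:2]
    col = int((az_deg + 180.0) / 360.0 * w)   # head azimuth -> column
    row = int((el_deg + 90.0) / 180.0 * h)    # head elevation -> row
    half_w = round(w * fov_deg / 720)         # half window, in pixels
    half_h = round(h * fov_deg / 360)
    cols = np.arange(col - half_w, col + half_w) % w          # wrap azimuth
    rows = np.clip(np.arange(row - half_h, row + half_h), 0, h - 1)
    return sphere[np.ix_(rows, cols)]
```

The azimuth index wraps modulo the mosaic width, so the view stays seamless as the pilot looks across the 180-degree boundary behind the aircraft.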

Interested companies should submit white papers no later than 10 Aug. 2015 online to https://safe.amrdec.army.mil/SAFE2/. Email questions or concerns to the Army's Brian Thomas at [email protected], and copy Sabin Joseph at [email protected].

More information is online at https://www.fbo.gov/notices/7cc2286c5dbd06dcface2a00a2771863.
