Naval Research Lab, Space Dynamics Lab, and Office of Naval Research partner to test autonomous multi-target, multi-user tracking

Aug. 29, 2011

Posted by Courtney E. Howard

WASHINGTON, 29 Aug. 2011. Officials from the Naval Research Laboratory (NRL) and the Space Dynamics Laboratory (SDL), with the support of the Office of Naval Research (ONR), tested an autonomous, multi-sensor, motion-tracking and interrogation system that automatically finds moving objects and presents high-resolution images of those objects, while requiring no human input.

The wealth of intelligence, surveillance, and reconnaissance (ISR) information gathered in the field can overwhelm human operators and delay the production of intelligence reports. The new multi-user tracking capability lets the system manage imagery collection without continuous monitoring by a ground or airborne operator.

“These tests display how a single imaging sensor can be used to provide imagery of multiple tracked objects,” says Dr. Brian Daniel, research physicist, NRL ISR Systems and Processing Section, “a job typically requiring multiple sensors.”

In flight tests performed in March 2011, multiple real-time tracks generated by a wide-area persistent surveillance sensor (WAPSS) were autonomously cross-cued to a high-resolution, narrow field-of-view (NFOV) interrogation sensor via an airborne network. The sensors were networked over the high-speed Tactical Reachback Extended Communications (TREC) data link provided by the Satellite and Wireless Technology Branch of the NRL Information Technology Division.
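The article does not describe how track updates were encoded on the TREC link, but the cross-cue flow it outlines, a wide-area tracker publishing geodetic track updates to a cue manager, can be sketched roughly as follows. This is a minimal illustration assuming a JSON-over-UDP encoding; the field names, port, and function are hypothetical inventions, not details from the demonstration.

```python
# Minimal sketch of a cross-cue message flow, assuming JSON over UDP.
# The message fields and the cue-manager endpoint are illustrative only.
import json
import socket
import time

CUE_MANAGER_ADDR = ("127.0.0.1", 5005)  # hypothetical cue-manager endpoint

def send_track_cue(sock, track_id, lat_deg, lon_deg):
    """Publish one track update from the wide-area sensor to the cue manager."""
    msg = {
        "track_id": track_id,
        "lat_deg": lat_deg,     # geodetic latitude of the tracked object
        "lon_deg": lon_deg,     # geodetic longitude of the tracked object
        "time_s": time.time(),  # timestamp so stale cues can be discarded
    }
    sock.sendto(json.dumps(msg).encode(), CUE_MANAGER_ADDR)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_track_cue(sock, track_id=17, lat_deg=38.8977, lon_deg=-77.0365)
```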

“The demonstration was a complete success,” says Dr. Michael Duncan, ONR program manager. “Not only did the network sensing demonstration achieve simultaneous real-time tracking, sensor cross cueing and inspection of multiple vehicle-sized objects, but we also showed an ability to follow smaller human-sized objects under specialized conditions.”

The network sensing demonstration employed sensors from other ONR-sponsored programs. The interrogation sensor was the precision, jitter-stabilized EyePod developed under the Fusion, Exploitation, Algorithm, and Targeting High-Altitude Reconnaissance (FEATHAR) program: a dual-band, visible near-infrared and long-wave infrared sensor mounted inside a nine-inch gimbal pod assembly designed for small UAV platforms. The test's wide-area sensor was the mid-wave infrared nighttime WAPSS (N-WAPSS), a 16-megapixel, large-format camera that captures single frames at four hertz and performs step-stare imaging with a one-hertz refresh rate.
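For a sense of scale, the raw output of a 16-megapixel focal plane at four hertz can be estimated with simple arithmetic. The sketch below assumes 16 bits per pixel, a figure not given in the article, so the result is only a rough order of magnitude.

```python
# Back-of-the-envelope data rate for the N-WAPSS camera. The article gives
# only the pixel count and frame rate; the bit depth is an assumption.
PIXELS = 16e6          # 16-megapixel large-format focal plane
BITS_PER_PIXEL = 16    # assumed bit depth
FRAME_RATE_HZ = 4      # single-frame capture rate from the article

raw_rate_gbps = PIXELS * BITS_PER_PIXEL * FRAME_RATE_HZ / 1e9
print(f"raw sensor output: {raw_rate_gbps:.2f} Gbit/s")  # ~1.02 Gbit/s
```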

Using precision geo-projection of the N-WAPSS imagery, the system tracked all moving vehicle-sized objects in the field of view in real time. The track data were converted to geodetic coordinates and transmitted over the airborne network to a cue manager, which autonomously tasked the EyePod to interrogate the selected tracks for target classification and identification.
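The geo-projection step, mapping a tracked object's pixel location to geodetic coordinates, can be illustrated under strong simplifying assumptions: a nadir-pointing camera, flat terrain, and a spherical Earth. The operational system would also account for sensor attitude, lens distortion, and terrain elevation, none of which are detailed in the article, so this sketch is illustrative only.

```python
# Simplified geo-projection: map a pixel offset from the image center to
# lat/lon, assuming a nadir-pointing camera over flat terrain.
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius, spherical approximation

def pixel_to_geodetic(px_east, px_north, gsd_m, ac_lat_deg, ac_lon_deg):
    """Convert a pixel offset (columns east, rows north) to lat/lon.

    gsd_m is the ground sample distance in meters per pixel; ac_lat_deg
    and ac_lon_deg locate the point directly below the aircraft, which is
    the image center under the nadir assumption.
    """
    east_m = px_east * gsd_m
    north_m = px_north * gsd_m
    dlat = math.degrees(north_m / EARTH_RADIUS_M)
    dlon = math.degrees(east_m / (EARTH_RADIUS_M * math.cos(math.radians(ac_lat_deg))))
    return ac_lat_deg + dlat, ac_lon_deg + dlon

# Example: an object 250 pixels east and 100 pixels north of nadir at 0.5 m GSD
lat, lon = pixel_to_geodetic(250, 100, 0.5, 38.8977, -77.0365)
print(f"track at {lat:.6f}, {lon:.6f}")
```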
