Peraton Labs to help develop artificial intelligence (AI) and machine learning for unmanned ground vehicles

Nov. 30, 2023
LINC will update control laws in real time while providing guidance and situational awareness to the human operator or autonomous controller.

ARLINGTON, Va. – U.S. military researchers needed artificial intelligence (AI) systems that respond well to conditions and events that these systems have never seen before. They found their solution at Peraton Labs Inc. in Basking Ridge, N.J.

Officials of the U.S. Defense Advanced Research Projects Agency (DARPA) in Arlington, Va., have announced a $9.3 million contract to Peraton Labs for the Learning Introspective Control (LINC) project.

LINC aims to develop AI- and machine learning-based technologies that enable computers to examine their own decision-making processes, so that military systems like manned and unmanned ground vehicles, ships, drone swarms, and robots can respond to events not predicted when these systems were designed.

Peraton Labs actually won the DARPA LINC contract on Nov. 3, 2022. The contract announcement came on Tuesday in a post-award synopsis issued by the U.S. Naval Information Warfare Center-Atlantic in Charleston, S.C.

Related: Artificial intelligence and machine learning for unmanned vehicles

LINC technologies will update control laws in real time while providing guidance and situational awareness to the operator, whether that operator is human or an autonomous controller.

Today's control systems seek to model operating environments expected at design time. Yet these systems can fail when they encounter unexpected conditions and events.

Instead, LINC will develop machine learning and introspection technologies that can characterize unforeseen circumstances, such as a damaged or modified military platform, from the platform's behavior, and then update the control law to maintain stability and control.

A LINC-equipped platform will continually compare the platform's behavior, as measured by on-board sensors, with a learned model of the system; determine whether the system's behavior could cause danger or instability; and implement an updated control law when required.
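The broad approach DARPA describes, continually comparing sensed behavior against a learned model and updating the control law when the mismatch grows risky, can be sketched as a simple monitor-and-adapt loop. The Python below is an illustrative assumption: the class names, the placeholder dynamics, the mismatch threshold, and the gain-halving update are stand-ins, not Peraton Labs' implementation.

```python
# Illustrative sketch of a LINC-style monitor-and-adapt cycle.
# All names, thresholds, and the update rule are hypothetical.
import numpy as np

class LearnedPlatformModel:
    """Predicts the state the platform should reach for a given command."""
    def predict(self, state: np.ndarray, command: np.ndarray) -> np.ndarray:
        return state + 0.1 * command  # placeholder linear dynamics

class ControlLaw:
    """Simple proportional control law with an adjustable gain."""
    def __init__(self, gain: float):
        self.gain = gain
    def command(self, state: np.ndarray, setpoint: np.ndarray) -> np.ndarray:
        return self.gain * (setpoint - state)

def monitor_and_adapt(model, law, read_sensors, apply_command, setpoint,
                      threshold: float = 0.5) -> float:
    """One control cycle: issue a command, compare the sensed result with the
    learned model's prediction, and back off the gain if the mismatch suggests
    the platform no longer behaves the way the model expects."""
    state = read_sensors()
    cmd = law.command(state, setpoint)
    apply_command(cmd)
    predicted = model.predict(state, cmd)
    actual = read_sensors()
    mismatch = float(np.linalg.norm(actual - predicted))
    if mismatch > threshold:   # behavior has diverged from the learned model
        law.gain *= 0.5        # hypothetical conservative control-law update
    return mismatch
```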

Related: Wanted: artificial intelligence (AI) and machine learning to help humans and computers work together

This could be an improvement over today's approaches to handling platform damage, which place the burden of recovery and control on the operator, whether that operator is human or an autonomous controller.

LINC will help operators maintain control of military platforms that suffer damage in battle or have been modified in the field in response to new requirements. LINC-enabled control systems will build models of their platforms by observing behavior, learning behavioral changes, and modifying how the system should respond to maintain uninterrupted operation.

LINC should be able to detect disruptive changes in control response and quickly develop a control regime based not only on the learned model, but also on changes that take place after the model has been learned.

LINC focuses on two technical areas: learning control by using onboard sensors and actuators; and communicating situational awareness and guidance to the operator.

Related: Marines ask Sentient Vision for artificial intelligence (AI) and machine autonomy for unmanned reconnaissance

The first technical area, learning control using onboard sensors and actuators, will perform cross-sensor data inference to characterize changes in system operation; rapidly prune possible solutions to reconstitute control under changed dynamics; and identify an area of nondestructive controllability by continually recalculating operating limits.
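One way to read "prune possible solutions" and "continually recalculating operating limits" is as a hypothesis-elimination loop over candidate dynamics models, with the speed envelope shrunk to whatever the surviving candidates can tolerate. The sketch below, including the max_stable_speed() method and the 4.5-mph nominal limit borrowed from the PackBot figures later in this article, is a hypothetical illustration rather than the program's actual software.

```python
# Hypothetical hypothesis-pruning over candidate dynamics models; the
# candidate interface, error tolerance, and speed-limit rule are assumptions.
import numpy as np

def prune_candidates(candidates, state, command, observed_next, tol: float = 0.2):
    """Keep only the candidate models whose predictions match the sensed outcome."""
    survivors = []
    for model in candidates:
        error = float(np.linalg.norm(model.predict(state, command) - observed_next))
        if error < tol:
            survivors.append(model)
    return survivors

def safe_speed_limit(candidates, nominal_limit_mph: float = 4.5) -> float:
    """Shrink the operating envelope to the most pessimistic surviving model.
    Each candidate is assumed to expose a max_stable_speed() estimate."""
    if not candidates:
        return 0.0  # no trusted model left: command the platform to stop
    return min(nominal_limit_mph, min(m.max_stable_speed() for m in candidates))
```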

The second technical area, communicating situational awareness and guidance to the operator, involves informing the operator of changes in system behavior in a concise, usable form by developing guidance and operating cues that convey details about the new control environment and its safety limits.
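What a "concise, usable" advisory to the operator might carry is not spelled out in the announcement; the fields below are a minimal guess at the kind of payload such a message could hold.

```python
from dataclasses import dataclass

@dataclass
class OperatorAdvisory:
    """Hypothetical advisory sent to a human or autonomous operator."""
    change_detected: str       # e.g. "left-track response degraded"
    max_safe_speed_mph: float  # recomputed operating limit
    confidence: float          # 0.0-1.0 confidence in the updated model
    recommended_action: str    # e.g. "reduce speed; avoid sharp left turns"
```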

LINC is a four-year, three-phase program. Initial work involves an iRobot PackBot ground robot and a remote 24-core processor. The robot weighs 20 pounds; measures 26.8 by 15.9 by 7.1 inches; has tracked and untracked flippers; moves at 4.5 miles per hour; and operates in temperatures from -20 to 50 degrees Celsius.

Related: Navy emphasizing unmanned surface vessels (USVs) and artificial intelligence (AI) in Middle East operations

The remote processor has an NVIDIA Jetson TX2 general-purpose graphics processing unit (GPGPU); a dual-core NVIDIA Denver central processor; a quad-core ARM Cortex-A57 MPCore processor; 256 CUDA software cores; eight gigabytes of 128-bit LPDDR4 memory; and 32 gigabytes of eMMC 5.1 data storage.

A key goal of the program is to establish an open-standards-based, multi-source, plug-and-play architecture that allows for interoperability and integration, including the ability to add, remove, substitute, and modify software and hardware components quickly.
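An open, plug-and-play architecture of this kind typically means that sensing, modeling, and control components conform to a common interface so they can be registered, replaced, or removed without touching the rest of the stack. The Protocol and registry below are a generic illustration of that pattern, not the program's actual standard.

```python
# Generic plug-and-play component contract and registry (illustrative only).
from typing import Dict, Protocol
import numpy as np

class ControlComponent(Protocol):
    """Common interface any swappable sensing, modeling, or control module meets."""
    name: str
    def start(self) -> None: ...
    def step(self, state: np.ndarray) -> np.ndarray: ...
    def stop(self) -> None: ...

registry: Dict[str, ControlComponent] = {}

def register(component: ControlComponent) -> None:
    """Add or replace a component at run time by name."""
    registry[component.name] = component

def remove(name: str) -> None:
    """Drop a component so an alternative implementation can be plugged in."""
    registry.pop(name, None)
```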

For more information contact Peraton Labs online at www.peratonlabs.com; DARPA at www.darpa.mil; or the Naval Information Warfare Center-Atlantic at www.niwcatlantic.navy.mil.
