Military researchers ask industry to use dialogue to help warfighters trust artificial intelligence (AI)

Nov. 27, 2023
FACT will explore how humans can communicate with AI systems in natural language without overtrust, using dialogue that reveals implicit assumptions among partners.

ARLINGTON, Va. – U.S. military researchers are asking industry to find ways of enhancing the accountability of artificial intelligence (AI) to enable accountable decision-making in complex environments for a variety of military applications.

Officials of the U.S. Defense Advanced Research Projects Agency (DARPA) in Arlington, Va., issued a presolicitation (DARPA-PA-23-04-02) last week for the Friction for Accountability in Conversational Transactions (FACT) project.

FACT will explore how humans can communicate with AI systems in natural language without overtrust, through dialogue that reveals implicit assumptions between partners and enables accountable decision-making.

AI today is easy to use with natural language. The problem, however, is that this ease can lead warfighters to accept AI recommendations uncritically, without considering unintended consequences.

Related: Artificial intelligence and embedded computing for unmanned vehicles

The FACT program assumes that AI can be developed with sufficient human/machine discussion to reveal critical biases and assumptions that could lead to bad decisions in strategic planning, intelligence analysis, and reconnaissance.

Dialogue is central to how human teams solve complex problems: at each stage, team members understand one another's intentions, assumptions, and accountability. Dialogue also can enable warfighters to uncover flawed reasoning and revise assumptions when all necessary information is not available in advance.

Currently, there are no systems that use dialogue to promote trust and accountability to ensure that solutions meet considerations not always enumerated at the start.

To that end, the FACT effort seeks to explore, develop, and evaluate human-AI conversation-shaping algorithms that capture mutual assumptions, views, and intentions based on dialogue history, and that automatically assess the consequences of potential actions, revealing costs and assumptions for critical analysis.
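The idea of a conversation-shaping algorithm can be pictured with a toy sketch: before acting on a high-consequence request, the system surfaces its own unconfirmed assumptions and asks the user to confirm them, rather than proceeding silently. Everything below — the class names, the cost threshold, and the confirmation rule — is a hypothetical illustration, not an approach drawn from the FACT solicitation.

```python
# Illustrative "conversational friction" sketch (all names and the scoring
# rule are hypothetical, not part of the FACT program).
from dataclasses import dataclass, field

@dataclass
class Turn:
    speaker: str                      # "user" or "ai"
    text: str
    assumptions: list = field(default_factory=list)

@dataclass
class Dialogue:
    turns: list = field(default_factory=list)

    def add(self, turn):
        self.turns.append(turn)

    def open_assumptions(self):
        # Every assumption stated so far that no user turn has confirmed verbatim.
        confirmed = {t.text for t in self.turns if t.speaker == "user"}
        return [a for t in self.turns for a in t.assumptions if a not in confirmed]

def respond(dialogue, recommendation, assumptions, consequence_cost):
    """Return the AI's next utterance. For costly actions with unconfirmed
    assumptions, inject friction: a clarifying question instead of the answer."""
    dialogue.add(Turn("ai", recommendation, assumptions))
    if consequence_cost > 0.5 and dialogue.open_assumptions():
        return ("Before proceeding, please confirm: "
                + "; ".join(dialogue.open_assumptions()))
    return recommendation

d = Dialogue()
d.add(Turn("user", "Plan the fastest route to the objective."))
reply = respond(d, "Take route A.",
                ["route A bridge is intact", "no adversary presence on route A"],
                consequence_cost=0.9)
print(reply)  # asks the user to confirm both assumptions before acting
```

The sketch shows only the shape of the idea: friction is applied selectively, when the estimated consequence of an action is high and the dialogue history still contains assumptions the human has not ratified.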

Related: Unmanned submarines seen as key to dominating the world’s oceans

FACT seeks to develop a proof-of-concept in applications like robotics, intelligence, surveillance, and reconnaissance (ISR), and mission planning.

FACT will be an 18-month effort divided into two phases: the first focusing on algorithm development and feasibility studies, and the second on a detailed evaluation of proofs-of-concept in a military application.

Interested companies should upload responses to the DARPA BAA portal online at https://baa.darpa.mil no later than 14 Dec. 2023.

Email questions or concerns to Matthew Marge, the DARPA FACT program manager, at [email protected]. More information is online at https://sam.gov/opp/c0113c4db961438c9e2fbcb7860d509c/view.

About the Author

John Keller | Editor

John Keller is editor-in-chief of Military & Aerospace Electronics magazine, which provides extensive coverage and analysis of enabling electronic and optoelectronic technologies in military, space, and commercial aviation applications. A member of the Military & Aerospace Electronics staff since the magazine's founding in 1989, Mr. Keller took over as chief editor in 1995.
