DARPA - Defense Advanced Research Projects Agency

09/27/2024 | Press release

Teaching AI What It Should and Shouldn’t Do

Thanks to the rapid growth of large language models (LLMs), artificial intelligence (AI) agents have quickly been integrated into many facets of everyday life - from drafting documents to generating artwork to providing research assistance. But verifying the accuracy or appropriateness of an AI's response is not always easy. For AI systems to be trusted partners with humans in situations where safe and ethical decisions are paramount, further work is needed to efficiently translate knowledge about human intent, laws, policies, and norms into logical programming languages that an AI can understand.

To achieve this goal, DARPA announced its new Human-AI Communications for Deontic Reasoning Devops program, or CODORD for short. Deontics, a philosophical term, refers to obligations, permissions, and prohibitions. Devops refers to the combination of software development and IT operations, including development that continues during operations. CODORD seeks to automatically translate deontic knowledge expressed by humans in natural language (e.g., spoken or written English, French, or German) into a highly expressive logical programming language. If successful, CODORD will vastly reduce the cost and time needed to transfer massive amounts of human-generated knowledge about obligations, permissions, and prohibitions into logical languages.
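
The announcement does not specify what CODORD's target logical language will be. Purely as an illustration of the kind of translation involved, the minimal Python sketch below uses a made-up DeonticRule structure, an invented traffic rule, and a hypothetical citation to show how one natural-language prohibition might be captured in a structured, machine-reasonable form; the program's actual target would be a far more expressive logic programming language.

    # Hypothetical, illustrative rule representation only; this is not
    # CODORD's actual target logical programming language.
    from dataclasses import dataclass, field
    from enum import Enum


    class Modality(Enum):
        OBLIGATION = "obligation"    # what an agent must do
        PERMISSION = "permission"    # what an agent may do
        PROHIBITION = "prohibition"  # what an agent must not do


    @dataclass
    class DeonticRule:
        modality: Modality
        agent: str        # who the rule applies to
        action: str       # the regulated action
        conditions: list = field(default_factory=list)  # when the rule applies
        source: str = ""  # citation back to the originating text


    # Natural-language input a human might speak or write:
    #   "Drivers must not exceed 100 km/h on the highway."
    # Structured form an automated translator would need to produce:
    rule = DeonticRule(
        modality=Modality.PROHIBITION,
        agent="driver",
        action="exceed_speed(100, 'km/h')",
        conditions=["location == 'highway'"],
        source="Traffic code, sec. 12 (hypothetical citation)",
    )
    print(rule)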

"The current process for transmitting deontic knowledge stored in someone's mind or in written documents into a logical language is very expensive and slow, because it requires specially skilled knowledge engineers trained in logic to work with experts in particular application domains," said Benjamin Grosof, CODORD program manager in DARPA's Defense Sciences Office. "This major roadblock, known as the 'knowledge authoring' or 'knowledge acquisition' bottleneck, prevents us from taking full advantage of advances in large language models and logical programming languages we've seen in recent years. We need new techniques that allow deontic knowledge to automatically be translated from natural human language into logical language for an AI to reason with. Removing this knowledge-authoring bottleneck is the goal of the CODORD program."

CODORD could bring significant benefits to military and civilian applications. It has the potential to enable automated deontic reasoning with high assurance, including about compliance with command orders, regulations, laws, operational policies, ethics, contracts, agreements, and strategies and plans. It could allow AI tools to become practical much more widely as part of decision support across a variety of crucial defense and commercial applications, such as operations planning; autonomous systems; supply chain, contracts, and finance; health treatment guidance; confidentiality and transaction authorization; systems integration, modeling, simulation, and wargaming; and national intelligence analysis.

"We've already seen the high assurance and business value of AIs that logically reason about deontic knowledge in an example where banking and finance regulations were translated into logical language via current time-consuming and expensive methods," Grosof said. "A trader or bank compliance officer could ask the AI whether a proposed transaction of a certain dollar amount between two banks would be legally permitted versus prohibited. The AI then provided an answer together with a fully detailed explanation in natural language of how it arrived at that decision. Its explanation would be a logical proof of the answer, with citations from regulatory documents, that a compliance officer - lacking expertise in logic or programming - could easily understand and effectively review. Since regulations, laws, and policies often are updated, rescinded, or replaced over time, CODORD aims to enable agile updates simply by speaking new deontic knowledge or uploading revised written documents."

This capability could be especially useful in the military when it comes to conveying commander's intent.

"I could potentially in the future hand over the orders generation process or the mission analysis and design process to an AI," said U.S. Marine Corps Operational Liaison and Special Assistant to the DARPA Director Col. Robert Gerbracht. "I need to have the assurance that an artificial intelligence would be able to pass my intent on in the spirit in which it was given and within the ethical, legal, and moral guidelines under which I, as a commander, issued that guidance. If an artificial intelligence is assisting me with that cognitive task, I have to be able to trust that it is going to do it the same way that I would do it, maybe even better if I'm tired, if I'm at a point of friction, or if I have multiple plates to keep spinning at the same time. I have to make sure that if a technology like this is going to help me with the orders process and commander's intent, my intent is not only unadulterated but still understandable to the junior warfighter it is issued to."

A CODORD Proposers Day for interested parties will be held on Oct. 8, 2024. The special notice is now available on SAM.gov. DARPA anticipates posting the full Disruption Opportunity solicitation on SAM.gov in the coming weeks.