DARPA looks to fund AI research for helping mechanics and medics
The Defense Advanced Research Projects Agency released a solicitation last week for artificial intelligence research that sounds like it’ll be Alexa or Siri on steroids.
DARPA is looking to fund research projects focused on “perceptually enabled task guidance,” according to the broad agency announcement.
A user -- DARPA uses the example of a medic or a mechanic -- would wear sensors that would allow an artificial intelligence agent to see and hear what the person is doing. The agent could then offer information and instruction using augmented reality.
“The goal is to enable mechanics, medics and other specialists to perform tasks within and beyond their skillsets by providing just-in-time feedback and instruction for physical tasks,” DARPA wrote in the notice.
Military personnel face increasingly complex tasks. Mechanics must repair and service highly sophisticated equipment. Medics are asked to perform more procedures over longer periods of time, according to DARPA.
“The goal of the PTG (perceptually-enabled task guidance) program is to make users more versatile by expanding their skillset and more proficient by reducing their errors,” DARPA wrote.
Outfitted with sensors and an augmented reality headset, the human would send data to the assistant and receive information and instructions in return.
DARPA describes three scenarios. First: the human can initiate the interaction by asking "What do I do next?" and then receive instructions. Second: if the human makes a mistake, the AI can issue a warning and suggest remedial action. Third: when a task is new, the AI can walk the human through its steps.
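The three scenarios amount to a simple interaction loop between user and assistant. The sketch below is purely illustrative — the class and method names are invented here, not taken from the DARPA solicitation — and assumes the assistant already holds an ordered list of task steps extracted from a checklist or manual:

```python
# Hypothetical sketch of the three PTG interaction scenarios.
# All names are illustrative, not part of any DARPA specification.

class TaskAssistant:
    def __init__(self, steps):
        self.steps = steps    # ordered task steps from a checklist or manual
        self.current = 0      # index of the step the user should do next

    def next_instruction(self):
        """Scenario 1: the user asks 'What do I do next?'"""
        if self.current < len(self.steps):
            return self.steps[self.current]
        return "Task complete."

    def observe(self, observed_step):
        """Scenario 2: warn and suggest remediation if the user deviates."""
        expected = self.steps[self.current]
        if observed_step != expected:
            return f"Warning: expected '{expected}'. Undo '{observed_step}' and retry."
        self.current += 1
        return "OK"

    def walkthrough(self):
        """Scenario 3: step through an unfamiliar task from the beginning."""
        return list(self.steps)


assistant = TaskAssistant(["drain oil", "replace filter", "refill oil"])
print(assistant.next_instruction())         # drain oil
print(assistant.observe("replace filter"))  # warning: step done out of order
print(assistant.observe("drain oil"))       # OK
```

In the real program, of course, the hard part is everything this sketch takes as given: recognizing from sensor data which step the user is actually performing.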
DARPA is calling the AI an “assistant” and the human is referred to as the “user.”
The agency is looking to exploit advances in deep learning, automated reasoning, and augmented reality. But the technologies themselves aren’t enough, so DARPA wants novel approaches and integrated technologies that address four key problems:
Knowledge transfer -- The assistants need to automatically acquire task knowledge from materials intended for humans such as checklists, manuals and training videos.
Perceptual grounding -- Objects, settings, actions, sounds and words recognized by the assistant must align with the terms used to describe tasks. This is needed so observations can be mapped to task knowledge.
Perceptual attention -- Assistants must pay attention to percepts that are relevant to the task, while ignoring extraneous stimuli. They also must respond to unexpected events that may alter the user’s goals or suggest a new task.
User modeling -- Assistants must determine how much information to present to the user and when to do so.
“These four problems are not independent of each other,” DARPA wrote.
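Perceptual grounding is the pivot the other problems turn on: labels coming out of a recognizer must line up with the vocabulary of the task knowledge before observations can be matched against steps. A minimal, purely illustrative sketch — the mapping and label names below are invented for this example, not drawn from the solicitation:

```python
# Illustrative sketch of perceptual grounding: translating raw percept
# labels from a recognizer into the terms used in task knowledge.
# The vocabulary here is invented for illustration.

GROUNDING = {
    "torque_wrench_01": "torque wrench",  # object label -> task term
    "turning_motion": "tighten",          # action label -> task term
}

def ground(percepts):
    """Map recognized percept labels to task-knowledge terms, dropping
    percepts with no task-relevant meaning (a crude stand-in for
    perceptual attention)."""
    return [GROUNDING[p] for p in percepts if p in GROUNDING]

print(ground(["torque_wrench_01", "background_noise", "turning_motion"]))
# ['torque wrench', 'tighten']
```

The sketch also hints at why DARPA says the four problems are not independent: the grounding table itself would have to be learned from human-oriented materials (knowledge transfer), and the filtering step is a degenerate form of perceptual attention.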
The agency is looking for proposals in two technical areas.
Area one is for fundamental research into the four key problems. Area two is to demonstrate technologies that address the four key problems in a militarily relevant scenario in one of three broad areas -- mechanical repair, battlefield medicine, or pilot guidance.
The BAA awards will run for four years, broken into a pair of two-year phases. Phase one will cover area one and phase two will cover area two.
DARPA expects to make six awards in area one worth a total of $30 million, followed by a pair of awards in area two worth $10 million.
A “proposers’ day” is scheduled for March 18 with abstracts due March 31 and final proposals having a deadline of May 14.
Posted by Nick Wakeman on Mar 10, 2021 at 1:47 PM