Inside DARPA’s search for an 'autonomous scientist' to support its researchers


The Defense Advanced Research Projects Agency is on the hunt for foundational AI models that can aid in the scientific discovery process.

The powerhouse innovation agency within the Department of Defense is asking for help developing artificial intelligence software, an “autonomous scientist,” that can support scientists and their research efforts.

DARPA issued an opportunity earlier this month soliciting AI research concepts from industry that can do for science what the technology is already beginning to do for software development.

DARPA officials said the tool, dubbed an “autonomous scientist,” is not intended to supplant internal researchers, but to help them efficiently develop new working theories using models trained on diverse sets of data.

“What we're trying to do is replicate the success that we've seen for automatic code generation,” Alvaro Velasquez, DARPA’s program manager for Foundation Models for Scientific Discovery, told Nextgov/FCW. “Right now, software engineers and coders enjoy these tools from OpenAI and Microsoft that help automate the generation of code. We would like to come up with a tool that helps automate the process of scientific discovery.”

Ideally, DARPA’s autonomous scientist will be creative, learning to generate unique scientific hypotheses that account for advanced aspects of experiments, such as scaling and cost-reduction opportunities, while also applying skeptical reasoning. In the solicitation, DARPA officials say that the final product should be “at least 10X better in scalability (problem size, data size etc.) and also in time efficiency.”

Velasquez reiterated that the autonomous scientist is a tool to help human researchers in both technical and creative ways.

“This is not meant to replace the human scientist,” he said. “This is meant to act as an aid to sort of automate the process of generating useful hypotheses and such.” 

Use cases for DARPA’s intended autonomous scientist are diverse. Some examples include applying the AI tool to research in climate modeling and materials discovery. The domain Velasquez cites in the solicitation is computational biology, where the autonomous scientist could aid protein-folding experiments.

Proposals for this contract opportunity are meant not just to train an autonomous scientist on the vast data inputs required for advanced generative output, but to have it extrapolate novel hypotheses to test based on the scientific parameters of various disciplines.

DARPA’s autonomous scientist work is exploratory. Constructing foundational AI models, large machine learning models trained on vast volumes of data that can handle diverse tasks, is a field still in its infancy, according to Velasquez. DARPA is overseeing several other exploratory AI efforts as part of the agency’s larger Artificial Intelligence Exploration grant and contract opportunities.

Should the autonomous scientist effort result in a safe and useful AI tool, the winning foundational models could be used to solve other challenges at DARPA and beyond. Velasquez said he would love to see it applied across other government research institutions, like the Air Force Research Lab and the Naval Research Lab.

“I would love to see this in every government service lab if we are successful,” he said. “I envision that someday, all scientists will have an autonomous co-scientist helping them along the way.”