DARPA deal brings geospatial analytics to the cloud

DARPA and Maxar are collaborating to bring more geospatial analytics to the cloud as new technologies and customer demand drive new business models.

The Defense Advanced Research Projects Agency has awarded a follow-on contract to Maxar Technologies for continued testing of an unclassified cloud computing environment the company created for analytics functions.

But the Geospatial Cloud Analytics hub is more than a pathfinder program for DARPA and Maxar to help analysts apply collected Earth imagery and other data at scale, although that is certainly one main aspect.

GCA is also an example of how the technologies and business models behind this work are changing almost simultaneously, alongside customer appetite for new tools such as cloud and artificial intelligence to augment what humans do.

“We’ve definitely seen a big shift in the types of questions that you can answer enabled by new technology. We’re deep in the curve in terms of tech acceleration,” said Tony Frazier, Maxar executive vice president for global field operations.

As Frazier pointed out in a phone interview Tuesday, compute and storage can be had at low or near-zero cost in a cloud environment, which facilitates collaborative innovation in an open-source community. And Maxar had something of a head start on making that shift.

“We had already gone through the exercise to move a hundred petabytes of our imagery into the Amazon cloud and enable the ecosystem of developers to build algorithms on top of that,” Frazier told me.

The piece on top of that cloud infrastructure is Maxar’s Geospatial Big Data platform, also known as GBDX, which came with the company’s acquisition of DigitalGlobe two years ago along with the latter’s vast Earth imagery library.

DARPA and Maxar are collaborating on GCA to “enable analytics and geospatial data at scale” for analysts, Frazier told me, and to help users create algorithms that can be deployed across the platform. Cloud is an enabler for faster experiments and quicker deployment of new offerings at scale once users find something that works, he said.

DARPA first awarded Maxar a $3.7 million contract in September of last year to create the GCA platform for end users in the defense and intelligence community. The GBDX platform is the hub’s foundation and connects users to the company’s vast library of satellite imagery, plus other data from industry partners.

A follow-on contract announced last week is for $4.3 million and covers technical support to those who are building and training machine learning models inside the GCA. DARPA wants its technical area experts to be better able to make sense of large volumes of geospatial data.

Other companies are part of the effort too. Lockheed Martin is working to create a machine learning model within GCA for identifying oil fracking sites and Culmen International is integrating data sets to detect and predict foreign civil unrest. Both companies are using Maxar imagery and other analytics products to create those offerings.

“Part of our model with this effort has not been just to enable essentially applied (research-and-development) by these performers to build out algorithms against the data sources that we’re capturing to our platform, but also to enable new business models,” Frazier said.

“As the performers come up with novel algorithms to identify unique signals, we’ve been working on new contracting mechanisms to make it easy for customers to acquire that at scale.”

One example he cited is the start of a new Earth observation special item number, or SIN, on the General Services Administration’s Schedule 70 contract for IT acquisitions. The National Geospatial-Intelligence Agency partnered with GSA to create this “one-stop shop” for agencies to buy geospatial intelligence products and services.

While not specifically called out in the release on the award, AI is certainly a key part of the equation for the GCA effort. The White House’s February executive order on prioritizing U.S. leadership in AI, the Defense Department’s AI strategy and the intelligence community’s own agenda are part of that backdrop.

“All of them are focused on how to harness pervasive sensors (and) more volume data on more rapid timelines as a way to make analysts more productive,” Frazier told me. “If you can train machines to be able to narrow the search space, take an image that covers a thousand square kilometers and focus on the few square kilometers that matter… that’s what they’re all trying to address right now.”
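Frazier’s description of narrowing the search space maps onto a common pattern in imagery analytics: split a large scene into tiles, score each tile with a trained detector, and surface only the tiles that clear a threshold for an analyst to review. The Python sketch below illustrates that idea only; the tile size, threshold, and stand-in detector are assumptions for illustration, not details of DARPA’s or Maxar’s actual pipeline.

```python
import numpy as np

TILE = 512  # tile edge length in pixels (assumed value for illustration)

def tile_scores(image: np.ndarray, score_fn) -> list[tuple[int, int, float]]:
    """Split a large scene into fixed-size tiles and score each one.

    `score_fn` stands in for any trained detector that returns the
    probability that a tile contains the feature of interest.
    """
    h, w = image.shape[:2]
    results = []
    for y in range(0, h - TILE + 1, TILE):
        for x in range(0, w - TILE + 1, TILE):
            tile = image[y:y + TILE, x:x + TILE]
            results.append((y, x, float(score_fn(tile))))
    return results

def top_tiles(scores, threshold=0.5):
    """Keep only the tiles worth an analyst's attention."""
    return [s for s in scores if s[2] >= threshold]

if __name__ == "__main__":
    # Stand-in scene and detector: a random array and a brightness heuristic,
    # purely to show the narrowing step end to end.
    scene = np.random.rand(4096, 4096)
    fake_detector = lambda t: t.mean()  # pretend model score in [0, 1]
    flagged = top_tiles(tile_scores(scene, fake_detector))
    print(f"{len(flagged)} of {(4096 // TILE) ** 2} tiles flagged for review")
```

In a real workflow the heuristic would be replaced by a model trained on labeled imagery, and the flagged tile coordinates would point analysts to the few square kilometers that matter within a much larger scene, which is the productivity gain Frazier describes.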