Cubic's investment moves toward data at speed and scale
- By Ross Wilkers
- Aug 13, 2019
Cubic Corp. has acquired a 20-percent stake in a Herndon, Virginia-based commercial software provider to take that outfit’s storage and processing tools into national security agencies that work with large swaths of imagery and video.
And if the arrangement progresses well for both parties, San Diego-based Cubic can exercise an option to acquire the remaining 80 percent of Pixia by February 2020. Cubic announced the transaction Thursday in its third-quarter earnings release and conference call with investors.
Pixia has been on Cubic’s radar screen in recent years as the latter has been an active acquirer of other defense technology companies, said Mike Twyman, president of Cubic’s mission solutions segment.
Both Cubic and its previously acquired companies “have been working on and off with Pixia… so we had good awareness of them and their technology,” Twyman told me.
Pixia was founded in 1999 and built its software platform to handle all kinds of live, dynamic data beyond still imagery and video. The tool also manages geospatial, radar and infrared data and is in use at defense and intelligence agencies, all of which face the challenge of taking that information in and applying it at scale.
“Their core technology optimizes data storage and dissemination in the cloud or on-premise, and really provides significant advantages for their customers in terms of the ability to access information and update information,” Twyman told me.
Twyman said that in Pixia, Cubic found a technology asset complementary to the video distribution portfolio that defense agencies and certain commercial customers use.
He described two of Cubic’s main lines of thinking that led to the partnership with Pixia. The first is how users access the data.
“Between us and Pixia, we ingest multiple types of geospatial data from commercial and military sensors,” Twyman said. “We provide access to that information to allow users to develop deeper situational understanding.”
From there comes the question of what users do with the information. And a big part of that involves automation.
“Big data tools like this are necessary to enable the (artificial intelligence) and machine learning algorithms that are being built to further provide additional situational understanding in the communities that we serve,” Twyman said.
Twyman’s comments there also illustrate a theme that comes up constantly in conversations we have with executives: analysts, technicians and other users still have to manually sort, to varying degrees, through large amounts of often unstructured data.
That is where automation technologies like AI and machine learning theoretically come in: to help augment some of those functions and free up analysts to spend more time applying the data and making decisions.
“You can see things in seconds as opposed to tens of minutes in terms of just accessing information, so our target is to leverage the human brain power,” Twyman said. “Having the ability to access the information quickly and process it to get additional insights from the big data is a logical evolution.
“As that matures, we’re in a position to help that.”
Ross Wilkers is a senior staff writer for Washington Technology. He can be reached at firstname.lastname@example.org. Follow him on Twitter: @rosswilkers. Also connect with him on LinkedIn.