AI executive order draws some praise, but where are the resources?


The order’s guardrails on artificial intelligence development and adoption also spotlight the importance, quality and location of data.

President Biden's artificial intelligence executive order and companion guidance document released Monday seem to be drawing generally good reviews across industry.

One lingering question is being raised by some executives, however: where is the money to pay for the requirements listed in the document?

The order includes reporting and review requirements, as well as guidance for what agencies should and should not do.

But with no clear funding source, former Navy Chief Information Officer Rob Carey is worried the order will do the opposite of what is intended and stifle innovation.

“I’m not surprised by the breadth and depth, but I don’t see the resources to deliver the outcomes it calls for,” said Carey, now president of Cloudera's government solutions division.

Carey added that the reporting and review requirements could act as a throttle on innovation because it isn't clear that the specific agencies called out in the order have the people and structures in place to meet them.

But the requirements could have one benefit, said Kevin Plexico, senior vice president of information solutions at Deltek.

Plexico compared the order and current level of interest in AI to past technology waves the government has gone through.

“It reminds me of the journey the government has been on with cloud adoption and FedRAMP,” Plexico said.

The Federal Risk and Authorization Management Program launched in 2011 to implement a more standardized approach for security assessments and authorizations for cloud computing products and services.

Early on in FedRAMP's history, there were requirements that agencies had to meet before they could adopt cloud solutions.

Plexico said Biden's new AI order is laying the foundation for how agencies need to think about using AI and its issues with bias, equity, privacy, data security and the workforce.

“The federal government has such a risk aversion to new technologies because they don’t want to introduce problems in an unintended way, and I think generative AI is like that,” Plexico said. “While the requirements might seem like they’ll slow down adoption, the requirements do give a good line of sight on how to adopt it.”

FedRAMP required protections and safeguards that otherwise would not have been there.

“I wonder where we’d be if we had gone down the path of adopting the cloud without some of these protections in place,” he said.

FedRAMP drove adoption of cloud computing because it provided guardrails.

“It has done more to drive adoption of the cloud in government than almost any policy initiative,” Plexico said.

When signed in 2002, the Federal Information Security Management Act mostly had paper-based reporting and review requirements. But industry quickly responded with tools and software solutions.

Plexico expects the same to happen with AI.

“To put in these monitoring and testing requirements it’ll be very manual and labor-intensive work but I think very shortly thereafter we’ll see tools and technologies to accelerate that,” Plexico said.

Carey sees one positive in the order: what it says about the importance of knowing where your data is and the formats it is in.

“This shines a very bright spotlight on my favorite subject – data,” Carey said. “If you don’t have your data act together, you can’t even play in the AI game.”

Agencies need to be able to trust their data before they can move into AI or machine learning.

“This is an exercise that says – 'Get your data house in order before you even start this journey,'” Carey said.