Microsoft takes Anthropic's side in DOD fight, warns it sets a new precedent

In a court briefing, Microsoft argues the Defense Department is using a national security policy designed for foreign adversaries against a U.S. company over a contract dispute.
Microsoft has filed a brief in support of Anthropic in the artificial intelligence tech company's ongoing battle with the Defense Department.
The company filed on Tuesday at the U.S. District Court for the Northern District of California, where Anthropic filed suit earlier this month challenging DOD’s determination that the company is a supply chain risk to national security.
Microsoft is arguing for a temporary restraining order on enforcement of the determination, saying the ban would harm Microsoft and other contractors that have deeply embedded Anthropic’s technology in their products.
Known as an amicus brief, the three-page document from Microsoft also lays out its argument over why DOD’s determination that Anthropic is a supply chain risk sets a dangerous precedent that puts all government contractors at risk.
A restraining order would also buy time for the two sides to resolve their dispute.
"We believe everyone involved shares common goals, and we need time and a process to find common ground," a Microsoft spokesperson said. "Everyone wants to ensure AI not used for mass domestic surveillance or to start a war without human control. The government, the entire tech sector, and the American public need a path to achieve all these goals together."
In its brief, Microsoft argues that DOD’s determination is an unprecedented use of the statute that describes “supply chain risk.”
This statute has never been used against a U.S. company before and has only been used against one foreign company, the Switzerland-headquartered Acronis AG.
In July 2025, the Office of the Director of National Intelligence issued an order prohibiting the use of Acronis products by intel agencies. The General Services Administration expanded the prohibition to all agencies.
Microsoft calls the action against Anthropic “drastic.” After DOD made its determination, President Trump ordered all federal agencies to stop using Anthropic.
“The determination has, without explaining the basis, labeled Anthropic a ‘supply chain risk’ against whom extraordinary measures are needed ‘to protect national security,’” Microsoft wrote. “The authority for the determination itself permits this action only against an adversary that poses an articulated threat to the United States.”
The word “adversary” is key to Microsoft’s argument: because Anthropic is a U.S. company, declaring it an adversary over a contract dispute is an extreme step.
Microsoft argues DOD does not explain why it considers Anthropic an adversary, which the statute requires before such a determination can be issued.
Microsoft also argues that a negotiated settlement is possible because DOD and Anthropic fundamentally agree on the guardrails that should govern the use of AI.
Their dispute arose over the specific terms and conditions. In a footnote, Microsoft refers to DOD’s recent agreement with OpenAI as proof that negotiations are possible.
A temporary restraining order would allow time for negotiation without companies like Microsoft having to dismantle products containing Anthropic’s technology, which could be extremely disruptive and expensive.
“Government suppliers will also have to expend substantial effort removing Anthropic and Anthropic products from their offerings to [DOD] in cases where alternatives are unavailable or Anthropic products are embedded,” Microsoft writes. “The costs for these actions—including reengineering, reprocurement, and associated legal and administrative costs—will be incurred immediately as suppliers will have to invest time, energy, personnel, and money into modifying and rebuilding offerings that incorporate Anthropic’s products and confirming the new versions of those offerings meet the contractual requirements.”
DOD’s action against Anthropic has the potential to delay all ongoing IT contracting at the department because contractors will have to review all their offerings to identify where they are using Anthropic’s products, Microsoft said.
Microsoft wants the restraining order so the court can determine whether DOD followed the statutory requirements to make the determination.
Ultimately, Microsoft's brief frames DOD's action as a dramatic overreach because the determination process was intended to target foreign adversaries, not U.S. companies.