Anthropic wins court backing to block Trump's ban on using its AI tools

The legal standoff between Anthropic and the U.S. Department of Defense reaches a preliminary ruling.

On Thursday, March 26, a federal judge in California ruled that the U.S. government’s punitive measures against the AI company Anthropic appear to exceed its authority, and ordered a temporary halt to the related actions.

District Court Judge Rita F. Lin granted Anthropic’s application for a temporary restraining order, requiring the government to pause its punitive measures against the company for one week, while the court continues to hear the case. The judge said:

The Trump administration’s directive is troubling, and it looks like an attempt to undermine Anthropic.

Previously, Anthropic argued that the U.S. Department of Defense and the Trump administration had labeled it as a “supply chain risk” and ordered government agencies to stop using its technology, violating the company’s First Amendment rights.

The key point of contention: restrictions on weapons use spark confrontation

The months-long standoff stems from Anthropic’s refusal to allow the Department of Defense to use its Claude models in fully autonomous lethal weapon systems or for domestic mass surveillance.

Earlier this month, Anthropic filed a lawsuit. In its complaint, the company argued that the supply chain risk designation and other punitive measures could cost the company hundreds of millions, or even tens of billions, of dollars. In the complaint, the company said:

These actions are unprecedented and violate the law. The Constitution does not allow the government to use its powerful authority to punish a company simply because it exercises protected freedom of speech.

At present, the temporary injunction will remain in place for one week as the court continues to review the case.

The outcome of this case will not only affect Anthropic’s own commercial interests, but also set a broader precedent regarding the boundaries of the government’s authority to procure AI services, and whether technology companies can include ethical-use restrictions in contracts.

In addition, the ruling deals a direct blow to the government’s plan to phase out Claude across federal agencies. Given how deeply Anthropic’s technology is embedded in government operations, the replacement process has already run into substantial difficulty.

The U.S. Department of Defense has previously used Claude extensively in military operations, including tasks such as target selection and analysis of missile strikes.

Judge questions government logic; defense secretary’s statements come under scrutiny

During the hearing, Judge Lin showed clear skepticism toward the government’s arguments.

She pressed the government’s attorney on why, if the Department of Defense could simply terminate its contract with Anthropic directly, it needed to invoke the supply chain risk designation mechanism, saying bluntly that “this looks like a deliberate tactic to suppress Anthropic.”

On the matter of Defense Secretary Pete Hegseth’s statements, the government’s attorney acknowledged that Hegseth had posted on social media that any contractors doing business with the U.S. armed forces must not collaborate with Anthropic, but argued that the post had no legal effect and therefore did not constitute the irreparable harm the company alleged in its complaint.

When Lin pressed the government’s attorney on why Hegseth would make a statement with no legal basis to support it, the attorney said he could not answer.

Lin also pointed out that the government’s series of actions against Anthropic did not appear to be based on specific national security concerns. Lin wrote in the ruling:

The Department of Defense has no valid basis to infer that Anthropic may be a threat solely because it candidly insists on usage restrictions.
