Anthropic wins court backing to block Trump's ban on using its AI tools
Legal standoff between Anthropic and the U.S. Department of Defense makes preliminary progress.
On Thursday, March 26, a federal judge in California ruled that the U.S. government’s punitive measures against the AI company Anthropic appear to be beyond its authority, and ordered a temporary halt to the related actions.
District Court Judge Rita F. Lin granted Anthropic’s application for a temporary restraining order, requiring the government to pause its punitive measures against the company for one week while the court continues to hear the case.
Previously, Anthropic argued that the U.S. Department of Defense and the Trump administration had labeled it as a “supply chain risk” and ordered government agencies to stop using its technology, violating the company’s First Amendment rights.
Restrictions on weapons use: the key point of contention
The months-long standoff stems from Anthropic’s refusal to allow the Department of Defense to use its Claude models in fully autonomous lethal weapon systems or for domestic mass surveillance.
Earlier this month, Anthropic filed a lawsuit. In its complaint, the company argued that the supply chain risk designation and other punitive measures could cost it hundreds of millions, or even tens of billions, of dollars.
At present, the temporary injunction will remain in place for one week as the court continues to review the case.
The outcome of this case will affect not only Anthropic’s own commercial interests but also set a broader precedent on the limits of the government’s authority to procure AI services, and on whether technology companies can write ethical-use restrictions into their contracts.
In addition, the ruling deals a direct blow to the government’s plan to replace Claude across federal agencies. Because Anthropic’s technology is deeply embedded in government operations, the replacement effort has already run into substantial difficulty.
The U.S. Department of Defense has previously used Claude extensively in military operations, including tasks such as target selection and analysis of missile strikes.
Judge questions government logic; defense secretary’s statements come under scrutiny
During the hearing, Judge Lin showed clear skepticism toward the government’s arguments.
She pressed the government’s attorney, asking why, if the Department of Defense could simply terminate its contract with Anthropic directly, it needed to invoke the supply chain risk designation mechanism, and said bluntly that “this looks like a deliberate tactic to suppress Anthropic.”
On the matter of Defense Secretary Pete Hegseth, the government’s attorney acknowledged that Hegseth had posted on social media that contractors doing business with the U.S. armed forces must not collaborate with Anthropic, but argued that the post carried no legal effect and therefore did not constitute the irreparable harm the company alleged in its complaint.
When Lin pressed the government’s attorney on why Hegseth would make a statement with no legal basis to support it, the attorney said he could not answer.
Lin also noted in the ruling that the government’s series of actions against Anthropic did not appear to be based on specific national security considerations.