
Anthropic sues Trump admin. over ‘supply chain risk’ designation – One America News Network

NEW YORK, NEW YORK - DECEMBER 03: New York Times columnist Andrew Ross Sorkin and CEO and co-founder of Anthropic Dario Amodei speak onstage during the 2025 New York Times Dealbook Summit at Jazz at Lincoln Center on December 03, 2025 in New York City. NYT columnist Sorkin hosted the annual Dealbook summit which brings together business and government leaders to discuss the most important stories across business, politics and culture. (Photo by Michael M. Santiago/Getty Images)

OAN Staff Addie Davis
5:55 PM – Monday, March 9, 2026

Anthropic filed a lawsuit in California on Monday against a number of government officials and departments, including the Department of War (DoW), over the company’s recent designation as a “supply chain risk.”

Anthropic describes itself as “an AI safety and research company that’s working to build reliable, interpretable, and steerable AI systems.”

“We do not believe this action is legally sound, and we see no choice but to challenge it in court,” Anthropic CEO Dario Amodei said in a statement.

The tension between Anthropic and the DoW centers on a fundamental disagreement over the boundaries of AI application.

 

According to the lawsuit, the DoW had issued a public ultimatum: Anthropic was told to “get on board” and yield to government demands by 5:01 p.m. on February 27, 2026, or “pay a price.” For its part, the company claims that it “remains firm in its refusal to allow its technology to be used for mass domestic surveillance or the development of fully autonomous weapons.”

Despite the friction, Secretary of War Pete Hegseth maintained that the government’s requirements for the AI were strictly for “all lawful purposes.”

Insider reports have stated that the Trump administration sought to leverage the AI to identify potential “sleeper cells” and “agents of chaos” — individuals believed to be linked to or sympathetic toward Islamist terrorist organizations. These efforts were reportedly intensified by the heightened regional instability and ongoing conflicts across the Middle East.

 

“AI-driven mass surveillance presents serious, novel risks to our fundamental liberties,” the company said. “To the extent that such surveillance is currently legal, this is only because the law has not yet caught up with the rapidly growing capabilities.”

On the point of fully autonomous weapons, Anthropic further argued that current AI capabilities are not reliable enough for such use. The company added that these exceptions “have not been a barrier to accelerating the adoption and use of our models within our armed forces to date.”

The lawsuit follows the February 27th announcement by President Trump and Secretary Hegseth formally designating Anthropic as a “supply chain risk.”

 

“Our position has never wavered and will never waver: the Department of War must have full, unrestricted access to Anthropic’s models for every LAWFUL purpose in defense of the Republic,” Hegseth posted to X.

“Their true objective is unmistakable: to seize veto power over the operational decisions of the United States military. That is unacceptable,” he continued.

Trump also posted to Truth Social harshly criticizing the company.

“The Leftwing nut jobs at Anthropic have made a DISASTROUS MISTAKE trying to STRONG-ARM the Department of War, and force them to obey their Terms of Service instead of our Constitution. Their selfishness is putting AMERICAN LIVES at risk, our Troops in danger, and our National Security in JEOPARDY,” the president said.

“Therefore, I am directing EVERY Federal Agency in the United States Government to IMMEDIATELY CEASE all use of Anthropic’s technology,” he added.

“Effective immediately, no contractor, supplier, or partner that does business with the United States military may conduct any commercial activity with Anthropic,” Hegseth continued.

Amodei claimed, however, that the company received a letter on Wednesday from the DoW confirming the designation, but that it has “a narrow scope,” noting that “the vast majority of our customers are unaffected by a supply chain risk designation.”

“Even for Department of War contractors, the supply chain risk designation doesn’t (and can’t) limit uses of Claude or business relationships with Anthropic if those are unrelated to their specific Department of War contracts,” Amodei said.

In the suit, Anthropic argued that its “core First Amendment freedoms are under attack,” and highlighted the economic and business consequences of the designation. The company also claimed in a previous statement that the “supply chain risk” label should only be reserved for U.S. adversaries and has “never before applied to an American company.”

The administration noted a six-month phase-out period, intended to ensure a smooth transition, during which Anthropic will continue providing services as the DoW migrates to a new partner.

In California, the lawsuit Anthropic filed names various government departments, agencies, and officials, including Hegseth and Secretary of State Marco Rubio, in their official capacities. Simultaneously, according to The Hill, Anthropic filed a separate suit in Washington, D.C., specifically challenging the “supply chain risk” designation.

Anthropic’s AI model, Claude, has become deeply integrated into national security infrastructure. NBC News reports that the Pentagon has deployed Claude on classified networks via its partnership with Palantir to support intelligence assessments, targeting recommendations and complex battle simulations.
