Economy March 11, 2026

Anthropic’s Challenge to Pentagon Blacklisting Sets Up Rare Legal Test of Supply-Chain Statute

Lawsuit argues designation violates free speech and due process as experts question the government’s statutory reach

By Jordan Park

Anthropic has sued to overturn a Pentagon determination that it is a national security supply chain risk, arguing the decision infringes on its First and Fifth Amendment rights and exceeded the limits of a rarely used procurement statute. Legal scholars say the case could force courts to clarify whether Section 3252 - the law invoked by the Defense Department - can apply to a U.S.-based company with no apparent foreign entanglements. The dispute highlights tensions between military operational concerns and company policies restricting use of AI for autonomous weapons and domestic surveillance.

Key Points

  • Anthropic has challenged the Pentagon’s designation of the company as a national security supply chain risk, arguing it violates First and Fifth Amendment protections and improperly applies Section 3252.
  • Section 3252 - used by the Defense Department to exclude companies from certain contracts to guard against sabotage or infiltration - is rarely used and, according to a review of legal databases, has not previously been tested against a U.S. company in court.
  • The dispute affects defense procurement and the commercial AI sector, with potential impacts on military contracting, company revenues, and how AI usage policies interact with national security procurement decisions.

Anthropic has filed suit seeking to overturn a Defense Department decision that labeled the artificial intelligence developer a national security supply chain risk, a designation that the company says unlawfully bars it from certain military contracts and punishes its public stance on AI safety. The company’s complaint, filed on Monday, contends the department’s action violated Anthropic’s rights under the First and Fifth Amendments and relied on a procurement statute whose application to a domestic firm has not been tested in court.

The Pentagon invoked Section 3252 - a provision designed to prevent adversaries from sabotaging or infiltrating federal information technology systems - when it announced the exclusion. That statute permits the defense secretary to prevent companies from bidding for certain contracts if their involvement could make military information systems vulnerable to an adversary’s interference, including sabotage or the malicious introduction of unwanted functionality that could surveil, deny, disrupt, or otherwise degrade system operations.

According to a review of legal databases, the provision has rarely been used and has not previously been litigated against a U.S. company. That lack of precedent is central to the legal fight now unfolding: with little case law to constrain the statute’s interpretation, courts typically give wide deference to executive-branch judgments on national security issues, and the government’s lawyers are expected to lean heavily on that deference in defending the designation.

Anthropic’s complaint contends the department’s move was both punitive and lacking in due process. The company says the exclusion followed its refusal to remove safety-oriented restrictions on Claude, its flagship AI tool, which bar the military from deploying Claude for autonomous weapons and for domestic surveillance. Company executives told reporters on Tuesday that the designation could cut multiple billions of dollars from Anthropic’s projected 2026 revenue and damage the firm’s reputation.

The legal pleadings also highlight contradictory government statements that Anthropic and outside lawyers argue undercut the Pentagon’s position. The suit notes that Defense Secretary Pete Hegseth described Claude as "exquisite" technology and said during a February 24 meeting that the department would "love" to work with it. The complaint further cites the continued use of Claude by the military as recently as last month during strikes on Iran, a point that raises questions about the department’s rationale for categorizing the company as a supply chain risk.

Hegseth designated Anthropic a national security supply chain risk on March 3 after the company declined to lift the usage restrictions. In a public post on February 27, Hegseth had accused Anthropic of cloaking itself in the "sanctimonious rhetoric of 'effective altruism'" to "strong-arm the United States military into submission." The department has argued that Anthropic’s constraints on Claude could endanger American lives. Anthropic in turn argues that its positions reflect concern that current AI systems are not reliable enough for autonomous weaponry and a principled opposition to domestic surveillance.

The Pentagon also invoked a separate, related statute in designating Anthropic a supply chain risk, a move that could ultimately widen the effect of the exclusion to civilian government contracts as well. Under Section 3252, however, the department is meant to use exclusion as a measure of last resort and is not required to compel other contractors to cease work with the barred firm entirely. Publicly available information does not show other companies previously designated under that statute, though the law does not require public disclosure of such determinations.

Legal scholars who reviewed the case say the government faces a steep evidentiary bar if it seeks to justify the designation as necessary to prevent foreign subversion. "It’s not at all clear that the statute can even apply to an American company where there’s no foreign entanglement," said Alan Rozenshtein, a professor at the University of Minnesota Law School. Amos Toh, a national security law expert at the Brennan Center for Justice, said he saw nothing in Claude’s published usage policies that would appear to create a realistic risk of foreign sabotage or subversion. "These are basically safety protocols," he said. "You can debate whether these protocols are acceptable or not, but they run directly counter to the risk that the law is designed to regulate."

Anthropic’s lawsuit also alleges the designation was motivated in part by political animus and personal hostility. The complaint cites public statements by senior administration figures, including a social media post in which President Donald Trump called Anthropic a "RADICAL LEFT WOKE COMPANY," arguing those comments bolster the company’s claim that the action was aimed at punishing its views rather than addressing genuine security vulnerabilities. Joel Dodge, a law professor at Vanderbilt University, said the combination of critical public rhetoric and the department’s conduct "undermine their case and suggest there was personal animus and bad blood between the parties, and that the Pentagon had it out for Anthropic."

Beyond its First Amendment claims, Anthropic contends the supply chain risk finding violated its Fifth Amendment right to due process by imposing severe penalties without meaningful procedural protections, factual findings, or an opportunity to contest the basis for exclusion. The company has also challenged the related, broader civilian designation on the same grounds.

For its part, the government could argue that judicial restraint is warranted because courts typically do not second-guess the president and cabinet secretaries on matters of national defense and procurement. Justice Department attorneys may point to precedent holding that contract decisions do not constitute First Amendment violations when they are supported by legitimate policy or operational considerations. The administration could also argue that the president and his cabinet have broad discretion in choosing suppliers for national defense work and that a vendor whose usage policies limit military options cannot reasonably be relied upon.

Outside specialists in government contracting say excluding a supplier under Section 3252 amounts to a severe penalty that requires the department to demonstrate that no reasonable alternative existed and that it carefully considered other steps before proceeding. Eric Crusius, an attorney who specializes in government contracts, described the action as akin to imposing the "death penalty" on a company and said the government will need to show it meticulously weighed other options prior to the exclusion.

Anthropic also argues that the decision should be set aside under the Administrative Procedure Act, which allows courts to overturn agency actions that are "arbitrary, capricious, an abuse of discretion, or otherwise not in accordance with law." Several legal experts said the company’s filing paints a picture of inconsistent policy positions by the government: threatening to use the Defense Production Act to compel Anthropic to sell services, simultaneously deploying Claude in active operations, and asserting the system is too risky for government contracts. "Not all of these things can be true," Rozenshtein observed, and that tension may be central to the company’s claim that the department’s action was arbitrary.

The pending litigation is likely to force courts to reconcile competing principles: deference to executive-branch national security judgments and the protections afforded by constitutional and administrative law. Because Section 3252 has little judicial history and few public precedents, the case could produce guidance on how far the statute reaches when applied to a U.S.-based technology provider whose usage restrictions are framed as safety measures rather than foreign entanglements.

Until a court rules, the designation will remain a cloud over Anthropic’s ability to bid for certain defense work and could have significant financial and reputational consequences. The dispute also raises broader questions for defense procurement and the commercial AI sector about how policy choices by technology companies - especially those concerning limits on military or surveillance use - intersect with national security priorities and procurement law.


Summary

Anthropic has sued the Defense Department to block a supply-chain risk designation that bars it from certain military contracts, arguing the order violates its constitutional rights and misapplies Section 3252, a little-used procurement statute. Legal experts say the case will test whether that law can apply to a U.S.-based firm and whether public statements from senior officials undercut the government’s asserted national security rationale.

Risks

  • Legal uncertainty over the scope of Section 3252 - courts typically defer to executive national security judgments, which could favor the government and leave the statute’s reach unresolved - impacts defense contracting and AI vendors.
  • Financial and reputational harm to Anthropic if the exclusion persists - executives estimate the designation could reduce 2026 revenue by multiple billions and damage the company’s standing with government and commercial customers.
  • Potential policy contradictions and operational inconsistencies - simultaneous government statements about compelling services, using Claude in active operations, and deeming it too risky for contracts raise questions that could prolong litigation and complicate procurement planning in defense and civilian agencies.
