Anthropic CEO Dario Amodei has reportedly been summoned by Secretary of War Pete Hegseth to the Pentagon on Tuesday morning for a pivotal discussion of the military application of the startup’s AI chatbot Claude.

A high-ranking defense official told Axios on Monday that “this is not a friendly meeting,” signaling a tense atmosphere.

While Anthropic told the publication that its ongoing discussions are “productive” and in “good faith,” defense officials have indicated that negotiations are teetering on the edge of failure, with no significant headway made thus far.

A report last week suggested that Hegseth had been considering severing ties with Anthropic and that the AI startup could soon be labeled as a “supply chain risk.” Anthropic is open to easing its terms of service but insists on barring its technology from being used for mass surveillance of Americans or the development of autonomous weapons. The Department of War, however, views those restrictions as overly limiting.

The Department of War and Anthropic did not immediately respond to Benzinga’s requests for comment.

Anthropic Flags AI Account Fraud, Elon Musk Reacts

Adding to the complexity, Anthropic recently accused three Chinese AI companies of creating 24,000 fraudulent accounts to exploit its Claude chatbot. This alleged theft of AI model capabilities is believed to be the largest documented case of its kind to date.

In response, Elon Musk posted on X, “Anthropic is guilty of stealing training data at massive scale and has had to pay multi-billion dollar settlements for their theft. This is just a fact.”

The development comes after Anthropic launched Claude Code Security on Friday, an AI-powered tool built into its coding platform that autonomously scans entire codebases for vulnerabilities. The company said its Opus 4.6 model uncovered more than 500 previously unknown high-severity flaws in live open-source projects.

Disclaimer: This content was partially produced with the help of AI tools and was reviewed and published by Benzinga editors.
