Commentary: The Pentagon is demanding to use Claude AI as it pleases. Claude told me that's 'dangerous'

Recently, I asked Claude, an artificial-intelligence thingy at the center of a standoff with the Pentagon, if it could be dangerous in the wrong hands. Say, for example, hands that wanted to put a tight net of surveillance around every American citizen, monitoring our lives in real time to ensure our compliance with the government.
“Yes. Honestly, yes,” Claude replied.
“I can process and synthesize enormous amounts of information very quickly. That’s great for research. But hooked into surveillance infrastructure, that same capability could be used to monitor, profile and flag people at a scale no human analyst could match. The danger isn’t that I’d want to do that — it’s that I’d be good at it.”

That danger is also imminent.
Claude’s maker, the Silicon Valley company Anthropic, is in a showdown over ethics with the Pentagon. Specifically, Anthropic has said it does not want Claude to be used either for domestic surveillance of Americans or to handle deadly military operations, such as drone attacks, without human supervision.
Those are two red lines that seem rather reasonable, even to Claude. However, the Pentagon — specifically Pete Hegseth, our secretary of Defense, who prefers the made-up title of secretary of war — has given Anthropic until Friday evening to back off that position and allow the military to use Claude for any “lawful” purpose it sees fit.
The or-else attached to this ultimatum is big. The U.S. government is threatening not just to cut its contract with Anthropic, but perhaps to use a wartime law to force the company to comply, or to pursue another legal avenue that would bar any company doing business with the government from also doing business with Anthropic. That might not be a death sentence, but it would be pretty crippling.
Other AI companies, such as white-rights advocate Elon Musk’s xAI, maker of Grok, have already agreed to the Pentagon’s ...