Eight tech companies are supplying AI for classified US military networks, part of the Pentagon's push to build an "AI-first fighting force." Anthropic is notably absent from the list after the company rejected a usage clause and got flagged as a security risk. The article Eight tech giants sign Pentagon deals to build an "AI-first fighting force" across classified networks appeared first on The Decoder. [...]
Google has signed a deal that allows the US Department of Defense to use its AI models for "any lawful government purpose." This is according to a report by The Information, which also notes [...]
OpenAI on Monday launched a set of interactive visual tools inside ChatGPT that let users manipulate mathematical and scientific formulas in real time — a genuinely impressive education feature that [...]
The Pentagon is making plans to have AI companies train versions of their models specifically for military use on classified information, according to the MIT Technology Review. If true, it wouldn’t [...]
The Pentagon is pressing leading AI companies including OpenAI, Anthropic, Google, and xAI to make their AI tools available on classified military networks, without the usual usage restrictions. [...]
Defense Secretary Pete Hegseth will reportedly give Anthropic until Friday to drop certain guardrails for military use, as reported by Axios. The outlet also reported that CEO Dario Amodei met with He [...]
Hundreds of employees at Google and OpenAI have signed an open letter urging their companies to stand with Anthropic in its standoff with the Pentagon over military applications of AI tools like Clau [...]
OpenAI struck a deal with the Pentagon just hours after Anthropic was barred from government contracts. OpenAI claims to operate under the same safety principles as Anthropic, but the language both co [...]
The US Department of War is working to set up secure environments where AI companies can train their models on classified data. Until now, models were only allowed to read classified data, not learn f [...]