Musk's Colossus 1 AI supercomputer's inefficient mixed-architecture design couldn't be used to train Grok, so xAI's using it for inference instead — Musk readies unified Blackwell-only Colossus 2 for frontier training and potential IPO · DeepSignal
OpenClaw creator burned through $1.3 million in OpenAI API tokens in a single month — bill covered 603 billion tokens across 7.6 million requests and 100 coding agents
AI Summary
OpenClaw's creator spent $1.3 million on 603 billion OpenAI tokens in one month.
Apache helicopter's 'loyal wingmen' support drones to be used for precision strikes, other duties — British Army's Project NYX funding goes to four firms as effort hits new milestone
AI Summary
UK MoD selects four firms to develop autonomous 'loyal wingmen' drones for Apache helicopters.
Musk's shift from Colossus 1 to Colossus 2 underscores how much efficient, homogeneous hardware architecture matters in AI training, signaling that developers and investors should prioritize unified, scalable designs to future-proof their projects.