It is wild that we still ask LLMs to think in plain text — the next 10x is in the latent stream.
Quick Take
Karpathy argues the next 10x in reasoning quality will come from latent-space CoT, not better text-based chains.
Key Points
- Plain-text CoT is an extraordinarily lossy interface: each reasoning step squeezes a rich hidden state into a single discrete token.
- Latent reasoning bypasses tokenisation overhead.
- Likely path to the next major capability jump.
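The contrast in the points above can be made concrete with a toy sketch. This is not any real model's implementation; the matrices, the `tanh` update, and the greedy `argmax` are all illustrative assumptions. It only shows the structural difference: text-based CoT must round-trip the hidden state through a discrete token at every step, while latent CoT feeds the full hidden state back unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)
H, V = 16, 50  # toy hidden size and vocabulary size (assumed, illustrative)
W_step = rng.normal(size=(H, H)) / np.sqrt(H)  # toy per-step "reasoning" transition
W_out = rng.normal(size=(H, V)) / np.sqrt(H)   # hidden state -> token logits
W_emb = rng.normal(size=(V, H)) / np.sqrt(V)   # token id -> embedding

def text_cot_step(h):
    """Plain-text CoT: project to the vocab, commit to one discrete token,
    then re-embed it. Everything in h except what the argmax preserves
    (~log2(V) bits) is discarded -- this is the lossy interface."""
    logits = h @ W_out
    token = int(np.argmax(logits))  # discretization: the information bottleneck
    return np.tanh(W_emb[token] @ W_step), token

def latent_cot_step(h):
    """Latent CoT: feed the full continuous hidden state straight back
    into the next step -- no tokenisation round-trip."""
    return np.tanh(h @ W_step)

h_text = rng.normal(size=H)
h_latent = h_text.copy()
for _ in range(4):
    h_text, tok = text_cot_step(h_text)   # H floats -> 1 token id -> H floats
    h_latent = latent_cot_step(h_latent)  # H floats -> H floats, lossless carry
```

In the text path, the chain's entire memory between steps is one token id per step; in the latent path it is the full `H`-dimensional vector, which is the information-theoretic intuition behind the claimed capability jump.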