2025-08-30
Andrej Karpathy, a former Tesla and OpenAI researcher, is part of a growing movement in the AI community calling for a new approach to building large language models (LLMs) and AI systems.
The article AI researcher Andrej Karpathy says he's "bearish on reinforcement learning" for LLM training appeared first on THE DECODER.
[...]
2025-03-13
Former OpenAI researcher Andrej Karpathy envisions a future where large language models (LLMs) become the primary interface for content. [...]
2025-10-01
Thinking Machines, the AI startup founded earlier this year by former OpenAI CTO Mira Murati, has launched its first product: Tinker, a Python-based API designed to make large language model (LLM) fin [...]
2025-10-09
Researchers at Nvidia have developed a new technique that flips the script on how large language models (LLMs) learn to reason. The method, called reinforcement learning pre-training (RLP), integrates [...]
2025-06-28
Shopify CEO Tobi Lütke and former Tesla and OpenAI researcher Andrej Karpathy say "context engineering" is more useful than prompt engineering when working with large language models. [...]
2025-09-30
Meta’s AI research team has released a new large language model (LLM) for coding that enhances code understanding by learning not only what code looks like, but also what it does when executed. The [...]
2025-10-08
The trend of AI researchers developing new, small open source generative models that outperform far larger, proprietary peers continued this week with yet another staggering advancement. Alexia Jolicoe [...]
2025-10-01
It’s been a rough time for Peloton. Last year was marred by deep staff cuts, a change of CEO, and a reckoning over where the home fitness company belonged after the post-pandemic boom. The answer is, unfortunat [...]
2025-10-02
A new study by Shanghai Jiao Tong University and SII Generative AI Research Lab (GAIR) shows that training large language models (LLMs) for complex, autonomous tasks does not require massive datasets. [...]