#490 – State of AI in 2026: LLMs, Coding, Scaling Laws, China, Agents, GPUs, AGI
FEB 1, 2026 – 1 MIN
Description
<p>Nathan Lambert and Sebastian Raschka are machine learning researchers, engineers, and educators. Nathan is the post-training lead at the Allen Institute for AI (Ai2) and the author of The RLHF Book. Sebastian Raschka is the author of Build a Large Language Model (From Scratch) and Build a Reasoning Model (From Scratch).<br />
Thank you for listening ❤ Check out our sponsors: <a href="https://lexfridman.com/sponsors/ep490-sc">https://lexfridman.com/sponsors/ep490-sc</a><br />
See below for timestamps, transcript, and to give feedback, submit questions, contact Lex, etc.</p>
<p><b>Transcript:</b><br />
<a href="https://lexfridman.com/ai-sota-2026-transcript">https://lexfridman.com/ai-sota-2026-transcript</a></p>
<p><b>CONTACT LEX:</b><br />
<b>Feedback</b> – give feedback to Lex: <a href="https://lexfridman.com/survey">https://lexfridman.com/survey</a><br />
<b>AMA</b> – submit questions, videos or call-in: <a href="https://lexfridman.com/ama">https://lexfridman.com/ama</a><br />
<b>Hiring</b> – join our team: <a href="https://lexfridman.com/hiring">https://lexfridman.com/hiring</a><br />
<b>Other</b> – other ways to get in touch: <a href="https://lexfridman.com/contact">https://lexfridman.com/contact</a></p>
<p><b>SPONSORS:</b><br />
To support this podcast, check out our sponsors & get discounts:<br />
<b>Box:</b> Intelligent content management platform.<br />
Go to <a href="https://lexfridman.com/s/box-ep490-sc">https://box.com/ai</a><br />
<b>Quo:</b> Phone system (calls, texts, contacts) for businesses.<br />
Go to <a href="https://lexfridman.com/s/quo-ep490-sc">https://quo.com/lex</a><br />
<b>UPLIFT Desk:</b> Standing desks and office ergonomics.<br />
Go to <a href="https://lexfridman.com/s/uplift_desk-ep490-sc">https://upliftdesk.com/lex</a><br />
<b>Fin:</b> AI agent for customer service.<br />
Go to <a href="https://lexfridman.com/s/fin-ep490-sc">https://fin.ai/lex</a><br />
<b>Shopify:</b> Sell stuff online.<br />
Go to <a href="https://lexfridman.com/s/shopify-ep490-sc">https://shopify.com/lex</a><br />
<b>CodeRabbit:</b> AI-powered code reviews.<br />
Go to <a href="https://lexfridman.com/s/coderabbit-ep490-sc">https://coderabbit.ai/lex</a><br />
<b>LMNT:</b> Zero-sugar electrolyte drink mix.<br />
Go to <a href="https://lexfridman.com/s/lmnt-ep490-sc">https://drinkLMNT.com/lex</a><br />
<b>Perplexity:</b> AI-powered answer engine.<br />
Go to <a href="https://lexfridman.com/s/perplexity-ep490-sc">https://perplexity.ai/</a></p>
<p><b>OUTLINE:</b><br />
(00:00) – Introduction<br />
(01:39) – Sponsors, Comments, and Reflections<br />
(16:29) – China vs US: Who wins the AI race?<br />
(25:11) – ChatGPT vs Claude vs Gemini vs Grok: Who is winning?<br />
(36:11) – Best AI for coding<br />
(43:02) – Open Source vs Closed Source LLMs<br />
(54:41) – Transformers: Evolution of LLMs since 2019<br />
(1:02:38) – AI Scaling Laws: Are they dead or still holding?<br />
(1:18:45) – How AI is trained: Pre-training, Mid-training, and Post-training<br />
(1:51:51) – Post-training explained: Exciting new research directions in LLMs<br />
(2:12:43) – Advice for beginners on how to get into AI development & research<br />
(2:35:36) – Work culture in AI (72+ hour weeks)<br />
(2:39:22) – Silicon Valley bubble<br />
(2:43:19) – Text diffusion models and other new research directions<br />
(2:49:01) – Tool use<br />
(2:53:17) – Continual learning<br />
(2:58:39) – Long context<br />
(3:04:54) – Robotics<br />
(3:14:04) – Timeline to AGI<br />
(3:21:20) – Will AI replace programmers?<br />
(3:39:51) – Is the dream of AGI dying?<br />
(3:46:40) – How will AI make money?<br />
(3:51:02) – Big acquisitions in 2026<br />
(3:55:34) – Future of OpenAI, Anthropic, Google DeepMind, xAI, Meta<br />
(4:08:08) – Manhattan Project for AI<br />
(4:14:42) – Future of NVIDIA, GPUs, and AI compute clusters<br />
(4:22:48) – Future of human civilization</p>