  • Nvidia vs Google in AI chip race, customers exploring alternatives to GPUs
  • OpenAI co-founder Sutskever sees shift from scaling to smarter research approaches
  • DeepMind documentary provides narrative context on how big AI labs operate

Tech Brief – Easing into Dec 2025

This week: a fresh push in the AI chip race between Nvidia and Google, a must-listen interview with OpenAI co-founder Ilya Sutskever about where AI research goes next, and a feature documentary from Google DeepMind now available on YouTube. TL;DR: hardware competition is heating up, a leading researcher is calling for a research-first reset, and DeepMind’s doc offers a peek behind the curtain.

Nvidia vs Google — what’s happening

The AI hardware rivalry stepped up this month after reports that major cloud and AI customers are exploring Google’s Tensor Processing Units (TPUs) as alternatives to Nvidia’s GPUs, and that even Meta has been linked to plans to rent or purchase Google TPUs for future use. The chatter shook markets and sparked public commentary from industry players as everyone sizes up whether Nvidia can hold onto its long-term dominance. [Tom’s Hardware]

Bottom line: Nvidia’s GPUs (and its CUDA ecosystem) still power a huge share of the AI stack, but Google’s TPU lineup and cloud strategy are a real structural challenge — expect more partner deals, pricing plays, and ecosystem moves (not just product announcements) over the next 6–12 months. [Barron’s]

Read more: reporting from Tom’s Hardware and Bloomberg, plus the sector analyses linked below.

Ilya Sutskever on the Dwarkesh podcast — highlights

In a wide-ranging conversation on Dwarkesh Patel’s podcast this week, OpenAI co-founder Ilya Sutskever argued that the industry is shifting “from an age of scaling to an age of research,” meaning the next phase will emphasize smarter scientific approaches rather than simply throwing more compute at the problem. He also raised concerns about generalization, the limits of pre-training, and how to use massive compute resources productively. [Dwarkesh]

Why this matters: Sutskever helped develop many of the foundational ideas behind large models, so when someone at that level calls for a research-first pivot, it signals where labs and funders may focus next: better algorithms, stronger evaluation methods, and safety and robustness work rather than pure scale. For listeners, the interview is a useful temperature check on industry direction. [Podwise]

Google DeepMind documentary — now on YouTube

The Thinking Game (a feature-length DeepMind documentary) is now publicly available on Google DeepMind’s YouTube channel and on Google’s blog. The film follows DeepMind’s team through major research moments and breakthroughs — including the AlphaFold story — and offers a human-facing view of how big AI labs operate. It’s a good watch if you want context on how research teams, engineering decisions, and long-term ambitions intersect. [blog.google]

Why these three stories together matter

  • Hardware shapes economics: who controls efficient compute affects pricing, access, and which companies can train the biggest models. Nvidia vs Google is a market and strategy story, not just a spec-sheet battle. [Barron’s]
  • Research vs scale: Sutskever’s comments reflect a potential sectoral shift — expect more R&D-focused headlines, funding rounds for research groups, and a push for new evaluation benchmarks. [Business Insider]
  • Human stories matter: DeepMind’s documentary provides narrative context — useful when you want to explain AI teams and tradeoffs to non-technical audiences. [blog.google]