Token Intelligence

Eric Dodds & John Wessel

Two friends break down AI, technology, and entrepreneurship through mental models, real-world experience and the pursuit of a life well-lived.

Episodes

  1. 1D AGO

    Shorts: the chat interface problem and Lobe's imaginary seed round

    Eric and John riff on Lobe's seed round, then dive deep on why chat is the wrong UI for most AI. They unpack the blank page problem, why context matters, and how embedded AI will replace chat.

    Summary

    In Episode 2, Lobe gets a theoretical $3 million seed round, and Eric and John discuss how they would deploy the capital, including potential acquisitions. Next, they dive into a detailed discussion of why chat has become the ubiquitous UI for AI. Eric feels strongly about its shortcomings, including poor literacy rates, the blank page problem, and the narrow set of use cases chat is actually good for. The why is even more interesting: their hypothesis is that cost is a primary driver because of how expensive it is to run models at scale. They wrap up by imagining a future where AI disappears from interfaces altogether and is embedded natively in intuitive, multimodal user experiences.

    Key takeaways

    Lobe.ai: Lobe's path forward is to acquire and partner for distribution (apps/sleep brands), integrate biometrics for REM triggers, and monetize interpretation and creative outputs.

    The AI chat interface: Chat is the wrong default interface for AI. It shines for search and inside high-context environments with clear task frames, but obscures the power of the tools in most other cases.

    Fundamental barriers limit the utility of chat: Americans have low literacy rates, and combined with the blank page problem, chat limits the value people can get from AI.

    Context is king: Multimodal, embedded AI will replace generic chat for many jobs. Think IDEs, docs, and app-native flows that deliver value in place.

    Hard costs influence the interface: Cost and infrastructure realities favor user-initiated interactions now; as economics improve, proactive, background "agentic" features will grow.

    Notable mentions with links

    Poe (by Quora) is shown as a chat aggregator illustrating how many tools converge on chat as the primary interface.

    Notion AI is used to demonstrate higher-context chat inside documents. It's helpful, but with UX pitfalls (e.g., overwriting content and unclear "terms of the transaction").

    Cursor (AI IDE) is highlighted as a high-context environment where chat plus multimodal controls (browser, on-page edits) make AI assistance more precise and useful.

    v0 is referenced as a multimodal design/build flow that lets users edit generated UI directly, going beyond pure chat to reduce the blank-page burden.

    Rabbit R1 is discussed as an alternative, voice-forward hardware form factor pushing beyond chat, with lessons about timing, expectations, and risk.

    Naveen Rao (Databricks) is quoted arguing that generic chat is "the worst interface for most apps," calling for insight delivered "at the right time in the right context."

    Benedict Evans is cited for the idea that most people will experience LLMs embedded inside apps rather than as standalone chatbots, similar to how SQL is invisible in products.

    Jakob Nielsen is noted for the view that prompt engineering's rise signals a UX gap, and that AI needs a Google-level leap in usability to cross the chasm.

    Low literacy rates are discussed as a key limiter. Good writers tend to extract more value from chat tools.

  2. JAN 10

    Bottlenecks mental model & tool time with Zo Computer

    Eric and John discuss bottlenecks as a mental model, uncovering why constraints are leverage, not blockers. Hands-on Tool Time is with Zo Computer, a stateful, powerful, AI-enabled cloud computer.

    Summary

    In the second half of Episode 1, Eric and John tackle "bottlenecks" as a core mental model: why they limit system output, when to keep them on purpose, and how to fix the right ones without creating worse slowdowns. They share examples from product development, content quality control at scale, and how the youngest child changes family life. In Tool Time, they go hands-on with Zo Computer, an AI-enabled cloud computer with state, plus agents and a real file system. Eric shares his screen to explore use cases like media management, hybrid search over local files, and remote development, ultimately questioning where the day-to-day value beats existing tools. Eric analyzes his entire history of blog post markdown files, and they conclude that running AI against physical files will be a big deal, but wonder whether Zo is the right form factor.

    Key takeaways

    Mental model: bottlenecks

    Identify the real constraint and keep good bottlenecks: Focus on the true bottleneck, not the noisiest part; optimizing fast stages is wasted effort. Some constraints (security, editorial review) protect quality and safety, so preserve them intentionally.

    Fewer focused people beat swarm tactics: Small, targeted groups resolve bottlenecks faster than all-hands pile-ons.

    Prototype fast, still ship with specs: High-fidelity prototypes unblock product velocity, but clear specifications prevent new downstream bottlenecks.

    Tool Time with Zo Computer

    Save long-running AI work as real artifacts: Working against files and services with memory beats transient chats when your work is long-running or spans multiple sessions.

    Files beat context windows: Hybrid search over a real file system is faster and more precise than stuffing giant context windows.

    Which use cases the remote AI computer will really solve: Tools like Zo seem well suited when they beat local workflows on security (code/data never leaves a controlled environment), scalable compute (beefy GPUs/CPUs on demand), or collaborative persistence (shared stateful workspaces, services, and logs that multiple people and agents can access).

    Notable mentions with links

    Mental model: bottlenecks

    The Great Mental Models is a book series by Shane Parrish that breaks down fundamental decision-making through Charlie Munger's latticework of mental models.

    The Goal is a business novel by Eliyahu M. Goldratt that popularizes the Theory of Constraints and introduces the "Herbie" Boy Scout hike as a vivid metaphor for bottlenecks.

    The Phoenix Project is an IT/DevOps retelling of The Goal that applies the Theory of Constraints to modern software delivery and operations.

    The Trans-Siberian Railway is used in The Great Mental Models to show how relieving one constraint in a massive project can trigger new ones elsewhere.

    Vercel's v0 is an AI-assisted tool for generating websites and apps that shrinks the prototyping gap and increases product velocity and fidelity.

    Tools and AI

    Raycast is a next-gen Mac launcher in the Spotlight/Alfred lineage that sparked a thought experiment about OS-level AI with rich local context and access.

    Alfred is an earlier Mac power-user launcher that provides historical context for Raycast's approach to extensible search and commands.

    Zo Computer is a persistent cloud computer with memory, storage, agents, services, and a real file system that the hosts tested for Plex, blog analysis, and remote development.

    ... (Read more at the episode page)

    1 hr
  3. JAN 4

    The Inner Ring & creating an AI startup on demand

    Eric and John invent "Lobe," a screenless AI for dream capture, then unpack C.S. Lewis's "The Inner Ring" to explore status, AI FOMO, and the long game of craft, character, trust, and defining "enough."

    Summary

    Eric and John kick off the inaugural episode of Token Intelligence with a live AI startup creation challenge. Responding to John's prompt, Eric imagines "Lobe," a screenless AI device for passive sleep listening that reconstructs and interprets your dreams. Charting a course to more serious waters, the hosts pivot to C.S. Lewis's "The Inner Ring," an 80-year-old address to university students, to unpack status, belonging, and career ambition in tech. They connect Lewis's warning to today's AI FOMO, contrasting short-game inner-ring chasing with the long-game path of craftsmanship, character, trust, and defining "enough" in work and life. Along the way, they share candid stories of startups, inner circles at school and work, and practical ways to stay curious without getting swept up in AI hype.

    Key takeaways

    Live-creating an AI startup called Lobe: A screenless, passive sleep-listening device that records during REM, blends audio with biometrics, reconstructs your dream, and offers paid interpretations, with optional visualizations via generative video tools.

    The Inner Ring: C.S. Lewis's warning that chasing insider status "will break your heart" maps to modern tech careers, where influence, visibility, and belonging can overshadow the work itself.

    Short game vs. long game: Inner-ring chasing can move titles fast, but the durable path is craftsmanship plus character, which builds trust and opens the door to meaningful opportunities and friendship.

    Define "enough": If freedom and time with loved ones are the goals, you can often change life structures now rather than deferring everything to a future exit or windfall.

    Managing AI FOMO: Name it, keep simple systems to stay current, study fundamentals (economics, incentives), and build small projects to demystify the tech without drowning in hype.

    Notable mentions with links

    Startup riff: inventing "Lobe" (screenless, passive-listening AI)

    Sleep tracking apps like Sleep Cycle are referenced as prior art for nighttime audio capture and sleep analysis, inspiring Lobe's focus on REM-triggered recording. Eric mistakenly referred to this as "Sleep Score" in the show.

    Eight Sleep is mentioned as a potential smart-mattress integration partner within the broader sleep-tech ecosystem.

    Sora is cited as a generative video tool that could visualize reconstructed dreams as shareable clips, extending Lobe's premium features.

    Career and culture: C.S. Lewis, inner circles, and the craft

    The Inner Ring is an address C.S. Lewis delivered at King's College, University of London, in 1944.

    War and Peace, by Leo Tolstoy, is quoted in The Inner Ring to illustrate the existence of informal "unwritten systems" that shape real power and belonging.

    The "PIE theory" of career success (Performance, Image, and Exposure) is discussed as a common framework for how people advance inside organizations.

    The Staff Engineer career path is highlighted as an individual-contributor track that rewards deep expertise and influence without requiring a move into management.

    Personal startup journeys and ecosystems

    The Iron Yard is referenced as a coding school startup experience that exposed the host to founder networks, fundraising, and an eventual exit.

    Zappos and Tony Hsieh are mentioned in the context of a founder lunch and talent pipeline discussions during that startup phase.

    ... (Read more at the episode page)

    1 hr 50 min
