AI Builders

Front Lines

GTM conversations with founders building the future of AI.

  1. How TwelveLabs sells AI to federal agencies: Mission alignment over process optimization | Jae Lee

    2025/10/15

    TwelveLabs is building purpose-built foundation models for video understanding, enabling enterprises to index, search, and analyze petabytes of video content at scale. Founded by three technical co-founders who met in South Korea's Cyber Command doing multimodal video understanding research, the company recognized early that video requires fundamentally different infrastructure than text or image AI. Now achieving 10x revenue growth and serving customers across media, entertainment, sports, advertising, and federal agencies, TwelveLabs is proving that category creation through extreme focus beats trend chasing. In this episode, Jae Lee shares how the company navigated early product decisions, built specialized GTM motions for established industries, and maintained technical conviction during years of building in relative obscurity.

    Topics Discussed:
    - How military research in multimodal video understanding led to founding TwelveLabs in 2020
    - The technical thesis: why video deserves purpose-built foundation models and inference infrastructure
    - Targeting video-centric industries where ROI justifies early-stage pricing: media, entertainment, sports, advertising, and defense
    - Partnership-driven distribution strategy and AWS Bedrock integration results
    - Specialized sales approach: generalist leaders, vertical-specific AEs and solutions architects
    - Maintaining extreme focus and avoiding hype cycles during the first three years of building
    - Federal GTM lessons: why In-Q-Tel partnership and authentic mission alignment matter more than process optimization
    - The discipline of saying no to large opportunities that don't fit ICP
    - Keeping hiring bars high when the entire team is underwater

    GTM Lessons For B2B Founders:
    - Hire vertical specialists on the front lines, not just at the top: TwelveLabs structures its GTM team with generalist leaders (head of GTM and VP of Revenue) who can sell any technology, but vertical-specialized AEs, solutions architects, and deployment engineers. These front-line team members come directly from the four target industries and understand customer workflows, buying patterns, and integration points without ramp time.
    - Infrastructure plays require integration partnerships, not displacement: In established industries with layered technology stacks, positioning as foundational infrastructure demands partnership-first distribution. Jae explained their approach: integration with media-specific GSIs, media asset management platforms, and cloud providers ensures TwelveLabs fits into existing workflows rather than forcing wholesale replacement.
    - Extreme focus on first-principles product development beats fast-follower tactics: While competitors built quick demos by wrapping existing models, TwelveLabs spent three years building proprietary video foundation models and indexing infrastructure from scratch. Jae was explicit about the cost: "It was painful journey in the first like two and a half, three years because folks are flying by." The payoff came from solving actual customer problems: indexing 2 million hours of content in two days, enabling semantic search at scale, and building agent workflows for specific use cases.
    - Federal requires cultural alignment before GTM optimization: TwelveLabs' federal success stems from authentic mission alignment, not just process execution. With In-Q-Tel as an investor providing an interface to agencies, and founders with military backgrounds, the company established credibility through shared values rather than sales tactics.
    - ICP discipline protects product focus and team morale: Saying no to large early opportunities that don't fit ICP is operationally painful but strategically essential. Jae acknowledged the difficulty: "Early on saying no to customers is hard... as a founder you want to grow your business and you know that's going to be good for the morale. But that's only true when the customers are actually their ideal customers."

    22 min
  2. How Freeplay built thought leadership by triangulating insights across hundreds of AI implementations | Ian Cairns

    2025/10/15

    Freeplay AI emerged from a precise timing insight: former Twitter API platform veterans Ian Cairns and Eric Schade recognized that generative AI created the same platform opportunity they'd previously captured with half a million monthly active developers. Their company now provides the observability, evaluation, and experimentation infrastructure that lets cross-functional teams, including non-technical domain experts, collaborate on AI systems that need to perform consistently in production.

    Topics Discussed:
    - Systematic customer discovery: 75 interviews in 90 days using jobs-to-be-done methodology to surface latent AI development pain points
    - Cross-functional AI development: how domain experts (lawyers, veterinarians, doctors) became essential collaborators when "English became the hottest programming language"
    - Production AI reliability challenges: moving beyond 60% prototype success rates to consistent production performance
    - Enterprise selling to technical buyers: why ABM and content worked where ads and outbound failed for VPs of engineering
    - Category creation without precedent: building thought leadership through triangulated insights across hundreds of implementations
    - Offline community building: growing a 3,000-person Colorado AI meetup with an authentic "give first" approach

    GTM Lessons For B2B Founders:
    - Structure customer discovery with jobs-to-be-done rigor: Ian executed a systematic 75-interview program in 90 days, moving beyond surface-level feature requests to understand fundamental motivations. Using Clay Christensen's framework, they discovered engineers weren't just frustrated with 60% AI prototype reliability; they were under career pressure to deliver AI wins while lacking tools to bridge the gap to production consistency. This deeper insight shaped Freeplay's positioning around professional success metrics rather than just technical capabilities.
    - Exploit diaspora networks from platform companies: Twitter's developer ecosystem became Ian's customer research goldmine. Platform company alumni have uniquely valuable networks because they previously interfaced with hundreds of technical teams. Rather than cold outreach, Ian leveraged existing relationships and warm introductions to reach heads of engineering who were actively experimenting with AI. This approach yielded higher-quality conversations and faster pattern recognition across use cases.
    - Target sophistication gaps in technical buying committees: Traditional SaaS tactics failed because Freeplay's buyers, VPs of engineering at companies building production AI, weren't responsive to ads or generic outbound. Instead, Ian invested in deep technical content (1500-2000 word blog posts), speaking engagements, and their "Deployed" podcast featuring practitioners from Google Labs and Box. This approach built credibility with sophisticated technical audiences who needed education about emerging best practices, not product demos.
    - Build authority through cross-portfolio insights: Rather than positioning as AI experts, Ian built trust by triangulating learnings across "hundreds of different companies" and sharing pattern recognition. Their messaging became "don't just take Freeplay's word for it—here's what we've seen work across environments." This approach resonated because no single company had enough AI production experience to claim definitive expertise. Aggregated insights became more valuable than individual case studies.
    - Time market entry for the infrastructure adoption curve: Ian deliberately positioned Freeplay for companies "3, 6, 12 months after being in production" rather than competing for initial AI experiments. They recognized organizations don't invest in formal evaluation infrastructure until they've proven AI matters to their business. This patient approach let them capture demand at the moment companies realized they needed serious operational discipline around AI systems.

    28 min
  3. How Cerebrium generated millions in ARR through partnerships without a sales team | Michael Louis

    2025/09/29

    Cerebrium is a serverless AI infrastructure platform orchestrating CPU and GPU compute for companies building voice agents, healthcare AI systems, manufacturing defect detection, and LLM hosting. The company operates across global markets, handling data residency constraints from GDPR to Saudi Arabia's data sovereignty requirements. In a recent episode of Category Visionaries, I sat down with Michael Louis, Co-Founder & CEO of Cerebrium, to explore how they built a high-performance infrastructure business serving enterprise customers with high five-figure to six-figure ACVs while maintaining 99.9%+ SLA requirements.

    Topics Discussed:
    - Building AI infrastructure before the GPT moment and strategic patience during the hype cycle
    - Scaling a distributed engineering team between Cape Town and NYC with 95% South African talent
    - Partnership-driven revenue generation producing millions in ARR without traditional sales teams
    - AI-powered market engineering achieving 35% LinkedIn reply rates through competitor analysis
    - Technical differentiation through cold start optimization and network latency improvements
    - Revenue expansion through global deployment and regulatory compliance automation

    GTM Lessons For B2B Founders:
    - Treat go-to-market as a systems engineering problem: Michael reframed traditional sales challenges through an engineering lens, focusing on constraints, scalability, and data-driven optimization. "I try to reframe my go to market problem as an engineering one and try to pick up, okay, like what are my constraints? Like how can I do this, how can it scale?" This systematic approach led to testing 8-10 different strategies, measuring conversion rates, and building automated pipelines rather than relying on manual processes that don't scale.
    - Structure partnerships for partner success before revenue sharing: Cerebrium generates millions in ARR through partners whose sales teams actively upsell their product. Their approach eliminates typical partnership friction: "We typically approach our partners saying like, look, you keep the money you make, we'll keep the money we make. If it goes well, we can talk about like rev share or some other agreement down the line." This removes commission complexity that kills B2B partnerships and allows partners to focus on customer value rather than internal revenue allocation conflicts.
    - Build AI-powered competitive intelligence for outbound at scale: Cerebrium's 35% LinkedIn reply rate comes from scraping competitor followers and LinkedIn engagement, running prospects through qualification agents that check funding status, ICP fit, and technical roles, then generating personalized outreach referencing specific interactions. "We saw you commented on Michael's post about latency in voice. Like, we think that's interesting. Like, here's a case study we did in the voice space."
    - Position infrastructure as revenue expansion, not cost optimization: While dev tools typically focus on developer productivity gains, Cerebrium frames their value proposition around market expansion and revenue growth. "We allow you to deploy your application in many different markets globally... go to market leaders love us and sales leaders because again we open up more markets for them and more revenue without getting their tech team involved."
    - Weaponize regulatory complexity as competitive differentiation: Cerebrium abstracts data sovereignty requirements across multiple jurisdictions: GDPR in Europe, data residency in Saudi Arabia, and other regional compliance frameworks. "As a company to build the infrastructure to have data sovereignty in all these companies and markets, it's a nightmare." By handling this complexity, they create significant switching costs and enable customers to expand internationally without engineering roadmap dependencies, making them essential to sales teams pursuing global accounts.

    25 min
  4. How OpenInfer discovered unexpected government traction by focusing on data ownership pain points | Behnam Bastani

    2025/09/16

    OpenInfer addresses the enterprise infrastructure gap that causes 70% of edge AI deployments to fail. Founded by system architects who previously built high-throughput runtime systems at Meta (enabling VR applications on Qualcomm chips via Oculus Link) and Roblox (scaling real-time operations across millions of gaming devices), OpenInfer applies proven architectural patterns to enterprise edge AI deployment. The company targets three specific customer pain points: cost reduction for AI-always-on applications, data sovereignty requirements in regulated environments, and reliability for systems that must function regardless of connectivity. In this episode of Category Visionaries, CEO and Founder Behnam Bastani reveals how external market catalysts like DeepSeek's efficiency breakthrough transformed investor perception and validated their compute optimization thesis.

    Topics Discussed:
    - System architecture pattern replication from Meta's Oculus Link to Roblox to OpenInfer
    - The compute efficiency gap: why "throwing hardware" at AI problems creates market inefficiencies
    - How DeepSeek's January 2025 breakthrough shifted investor sentiment from skepticism to oversubscription
    - Customer targeting methodology: focusing on business unit leaders facing career consequences
    - Government market discovery: air-gapped environments and data sovereignty requirements
    - Technical demonstration strategies for overcoming the 70% edge deployment failure rate
    - Privacy-first AI positioning unlocking previously inaccessible use cases

    GTM Lessons For B2B Founders:
    - Target decision-makers with career-level consequences: Rather than pursuing prospects who might "take a risk," Behnam focuses on "those that lose their jobs if they're not solving the problem," specifically business unit leaders whose profit margins or sales metrics directly impact their career trajectory. This creates urgency that comfortable cloud users lack and accelerates deal cycles by aligning solution adoption with personal survival incentives.
    - Leverage external market catalysts for thesis validation: OpenInfer initially faced investor pushback ("Nvidia's got everything working well. Why you think you can do anything better?") until DeepSeek's efficiency breakthrough provided third-party validation. "January hits and then there's DeepSeek... People called us, hey, you're DeepSeek on edge." Founders should identify potential external events that could validate their contrarian thesis and be prepared to capitalize when these catalysts occur.
    - Lead with technical proof points over explanations: In markets with high failure rates, demonstrations eliminate skepticism faster than education. "We definitely have metrics, demos, and we go with those. We demonstrate what's possible... we remove this skepticalism in terms of ease of deployments, power of edge in one shot." This approach recognizes that technical buyers need confidence before curiosity.
    - Pursue unexpected traction sources aggressively: Despite targeting enterprise ISVs, government demand emerged due to air-gapped environment requirements. "Government is actually becoming huge traction primarily because data ownership was a major topic to them." Rather than forcing initial market hypotheses, founders should redirect resources toward segments showing organic product-market fit signals, even when they require different sales processes.
    - Build credibility through architectural pattern repetition: Investors backed OpenInfer because "we are the people that have built this twice, scaled it to millions." Repeating proven technical patterns across different contexts creates sustainable competitive advantages that new entrants cannot replicate without similar experience depth.

    21 min
  5. How Hamming AI accidentally created a new category by focusing on customer problems instead of category creation | Sumanyu Sharma ($3.8M Raised)

    2025/09/12

    Hamming AI has emerged as a pioneer in voice agent quality assurance, creating what founder Sumanyu Sharma calls a "new category" of QA for conversational voice agents. After spending a decade building data products at scale at companies like Tesla and Citizen, Sharma recognized an acute pain point as voice agents began proliferating: enterprises desperately needed confidence that their voice agents would work reliably before launching to production. In this episode of Category Visionaries, Sharma shares how his team accidentally created a new category by following their instincts and leveraging a decade of expertise in reliability testing, audio processing, and machine learning.

    Topics Discussed:
    - The evolution from Tesla's data science team to founding a voice agent QA company
    - How "wandering the desert" for months led to finding the perfect problem-solution fit
    - Building a completely inbound-driven go-to-market strategy in an emerging category
    - The decision to launch before feeling ready and building alongside customers
    - Why the voice agent market skeptics were wrong about market size
    - Creating enterprise trust through reliability testing at scale

    GTM Lessons For B2B Founders:
    - Follow your instincts when you have deep domain expertise: Sharma spent months "wandering the desert" looking for the right problem until voice agent QA clicked. He emphasizes that when you have a decade of relevant expertise, you can recognize the perfect problem when it appears. As he put it, "when you see it, you kind of know... I am perfectly equipped to solve this specific problem. I'm built for this." Founders should trust their instincts when they have genuine domain expertise rather than overthinking market validation.
    - Build something people want before focusing on category creation: Unlike many founders who start with category creation in mind, Hamming AI "accidentally" created their category by obsessively solving customer problems. Sharma notes, "We weren't looking to create a category. We were just looking to solve a problem that we feel passionate about, that we are already experts at." This customer-first approach led to organic category emergence and sustainable demand.
    - Launch before you feel ready and build with customers: Sharma's biggest learning was launching with a "half-baked" product rather than perfecting it in isolation. "We didn't have a product that we thought was incredible. We just thought, hey, it kind of works, but let's actually build the product together with customers." This approach accelerated learning cycles and created stronger product-market fit than months of internal development would have achieved.
    - Leverage contrarian insights from deep market proximity: While others dismissed voice agent QA as "too small," Sharma's data science background and proximity to builders gave him conviction. He analyzed the fundamentals: "Voice is a universal API for people. Voice agents are just becoming possible. They will be unreliable. Therefore, testing is very important. That's the math." Founders should develop conviction through first-principles thinking rather than consensus market opinions.
    - Focus obsessively on customer success over marketing in emerging categories: Hamming AI remains completely inbound-driven, focusing entirely on making existing customers successful rather than traditional marketing. Sharma explains, "The voice space is so small where if you are doing a good job and if you build a product that people love, they will tell their friends about it." In nascent categories, product excellence and word-of-mouth can be more effective than broad marketing campaigns.

    20 min
  6. How Nevermined coined "AI commerce" in 2023 to create category language before market adoption | Don Gossen

    2025/09/11

    Nevermined is pioneering the infrastructure for AI commerce, building payment rails specifically designed for agent-to-agent transactions. With a vision of trillions of AI agents functioning as both merchants and consumers, Don Gossen brings 20 years of AI experience to solving what he believes will be the foundational payment challenge of the next era of computing. In this episode of Category Visionaries, Don shares insights on creating an entirely new category, AI commerce, and the unique go-to-market challenges of building for a future that's rapidly becoming reality.

    Topics Discussed:
    - The emergence of two distinct agent modalities: agent as proxy and agent as independent economic actor
    - Why existing payment infrastructure cannot handle the scale and velocity of AI agent transactions
    - Nevermined's commission-based business model focused on agent-to-agent payments
    - The fundamental cost model differences between SaaS and AI agents
    - Creating the "AI commerce" category and the strategic importance of early categorization
    - Go-to-market strategy targeting verticalized AI agent builders with Series A+ funding
    - The infrastructure investment phase versus deployment challenges in AI adoption

    GTM Lessons For B2B Founders:
    - Target customers who have proven business models, not just potential: Don's go-to-market strategy specifically targets AI agent companies that have raised Series A or later rounds. His reasoning: "Hopefully the VCs that are backing them have done some due diligence. And the money they're earning is actually real." Rather than chasing every potential customer, focus on those who have already validated their revenue model and can immediately benefit from your solution.
    - Understand the fundamental cost structure of your customer's business model: Don identified that AI agents have an inverted cost model compared to traditional SaaS: most costs are operational (OpEx) rather than capital (CapEx). He explains: "The cost model is basically flipped. Most of your cost is actually on the opex... Your operating costs fluctuate based on the request." This insight shaped Nevermined's entire value proposition around cost monitoring and settlement rather than just payment processing.
    - Create category language early, even before market adoption: Don coined "AI commerce" in 2023 when "people were like, what the hell's an AI agent?" His approach: "It always helps to categorize and provide language that's going to allow people to understand what it is that you're talking about... It's the memeification of the category." Don't wait for your market to mature; create the vocabulary that will define it.
    - Focus on the operational reality, not the theoretical use case: While competitors focus on connecting bank accounts to AI agents for consumer purchases, Don focuses on the underlying workflow costs: "How much does the workflow cost to actually render that outcome?" Understanding the true operational mechanics of your customers' business, not just their surface-level needs, can create significant competitive differentiation.
    - Leverage deep domain expertise to identify non-obvious problems: Don's 20 years in AI revealed that variable AI agent responses create variable operational costs, a problem most founders wouldn't recognize. He notes: "Until recently most people didn't realize that is a major issue in operating these solutions." Deep industry experience can help you spot problems that newer entrants miss entirely.

    18 min
  7. Why Typedef starts go-to-market activities during the design partner phase instead of after | Kostas Pardalis ($5.5M Raised)

    2025/08/19

    Typedef is building an inference-first data engine designed for the new era of AI agents and machine-to-machine interactions. With $5.5 million in funding, the company is reimagining data infrastructure for a world where both humans and AI systems need seamless access to data processing capabilities. In this episode of Category Visionaries, I sat down with Kostas Pardalis, Co-Founder & CEO of Typedef, to explore how the company is addressing the fundamental shift from traditional business intelligence platforms to AI-native data infrastructure that treats inference as a first-class citizen alongside traditional compute resources.

    Topics Discussed:
    - Typedef's vision for inference-first data infrastructure in the AI era
    - The transition from human-only to machine-to-machine data interactions
    - Why infrastructure companies take longer to reach revenue but build deeper moats
    - The evolution from pre-AI data platforms to AI-native solutions
    - Design partner strategies for infrastructure companies
    - Go-to-market approaches that combine bottom-up (engineers) and top-down (decision makers) strategies
    - Category creation challenges in rapidly evolving AI markets
    - The importance of open source and education in developer-focused go-to-market

    GTM Lessons For B2B Founders:
    - Start go-to-market activities during the design partner phase: Kostas emphasized that go-to-market isn't something you switch on after product development. "It's okay to go out there and talk about something that it's not very well defined or it might change, but actually it doesn't matter... go to market like just like everything else, it's an interactive process." B2B founders should begin building awareness, creating content, and engaging with potential customers even while their product is still evolving.
    - Design partners must have real pain, not just time: The biggest insight about design partnerships is treating them like real customer relationships. "A design partner is still someone who has a problem that needs to be solved... no one is just donating their time out there... There still has to be value there." Don't approach design partnerships as charity work; ensure there's genuine mutual value exchange where your solution addresses real business pain.
    - Product-market fit requires both product AND market innovation: Kostas challenged the common engineering mindset about product-market fit: "Many times, especially engineers, think that when we say product, market fit is that we have market, which is a static thing and we just need to iterate over the product until we find the right thing that matches exactly the market. No, that's not right." B2B founders must innovate on both the product and go-to-market sides simultaneously, including defining their target vertical and building appropriate sales motions.
    - Infrastructure sales require dual-persona strategies: When selling developer tools and technical infrastructure, you need both bottom-up and top-down approaches. "Even if you go to the manager and they love what you are saying, you still have to convince the engineers to use this thing... And they have a lot of leverage and vice versa." The bottom-up motion involves open source adoption and education, while the top-down involves traditional outbound sales to decision makers.
    - Category creation doesn't guarantee category dominance: Having witnessed category creation firsthand, Kostas shared that defining a category doesn't ensure winning it. "It doesn't necessarily mean that because you define the categories that you are going to win at the end... Vercel was not actually the company that invented the category there." Focus on solving real problems and building sustainable competitive advantages rather than just being first to market with category messaging.

    28 min
  8. How Personal AI scales enterprise contracts by selling to COOs and business users first | Suman Kanuganti ($16M Raised)

    2025/08/19

    Personal AI is pioneering the next generation of artificial intelligence with their memory-first platform that creates personalized AI models for individuals and organizations. Having raised over $16 million, the company has evolved from targeting consumers to focusing on enterprise customers who need highly private, precise, and personalized AI solutions. In this episode of Category Visionaries, we sat down with Suman Kanuganti, CEO and Co-Founder of Personal AI, to explore the company's journey from early AI experimentation in 2015 to building what he envisions as the future AI workforce for enterprise organizations.

    Topics Discussed:
    - Personal AI's evolution from consumer-focused to enterprise B2B platform
    - The technical architecture behind personal language models vs. large language models
    - Privacy-first approach and competitive advantages in regulated industries
    - Go-to-market pivot and scaling from small law firms to enterprise contracts
    - Unit economics advantages and 10x cost reduction compared to traditional LLMs
    - Vision for AI workforce integration in public companies within 3-5 years

    GTM Lessons For B2B Founders:
    - Recognize when market timing doesn't align with your vision: Suman's team was building AI solutions as early as 2015, nearly a decade before the ChatGPT moment. When ChatGPT launched in November 2022, Personal AI faced confusion from investors and customers about their differentiation. Rather than forcing their sophisticated personal AI models on consumers who wanted simpler solutions, they recognized the market mismatch and pivoted. B2B founders should be prepared to adjust their go-to-market approach when market readiness doesn't match their technical capabilities, even if their technology is superior.
    - Find your wedge in enterprise through specific pain points: Personal AI discovered their enterprise entry point by targeting "highly sensitive use cases that LLMs are not good for" where companies would be "shit scared to put any data in the LLM." They focused on precision and privacy pain points that large language models couldn't address. B2B founders should identify specific enterprise pain points where their solution provides clear advantages over existing alternatives, rather than trying to be everything to everyone.
    - Let customer expansion drive revenue growth: Personal AI's enterprise strategy evolved organically as existing contracts "started growing like wildfire as more people had a creative mindset to solve the problem with the platform." They discovered that their Persona concept allowed enterprises to consolidate multiple AI use cases into one platform. B2B founders should design their platforms to naturally expand within organizations and reduce vendor fragmentation, creating stickiness and increasing average contract values.
    - Leverage architectural advantages for unit economics: By positioning their personal language models between customer use cases and large language models, Personal AI achieved "10x lower cost" per token. This architectural decision created both privacy benefits and economic advantages. B2B founders should consider how their technical architecture can create sustainable competitive advantages in both functionality and economics, not just features.
    - Geography matters more than you think for fundraising: Suman identified his biggest fundraising mistake as not moving to San Francisco earlier, stating "back in 2022 or 2023 is when I should have moved to San Francisco, period." He learned that being part of the Silicon Valley ecosystem and conversation is critical for fundraising success. B2B founders should consider the strategic importance of physical presence in key markets, especially when raising capital, and not underestimate the value of in-person relationship building.

    25 min

About this show

GTM conversations with founders building the future of AI.