CTO2CTO

Forte Group

CTO2CTO features engaging conversations between Chief Technology Officers (CTOs). Each episode contains insights on technology trends, innovation and software development.

  1. Throwaway Code & Token Economics: The New ROI of IT Operations

    APR 27

    Throwaway Code & Token Economics: The New ROI of IT Operations

    In this episode, Lucas Hendrich, CTO at Forte Group, sits down with Garrett Fitzgerald, CIO at Salute, to explore how AI is transforming businesses that haven't traditionally been "tech first," from family-owned manufacturing companies to data center services.

    "The things that cost $100 million at GE 15 years ago can now be done at a 15-person manufacturing business." Garrett's approach to value creation in lower middle market companies starts with upgrading the basics: Microsoft licensing, cloud storage, Azure services. But with AI tools like Claude, he can now deploy enterprise-grade solutions, like configuring Intune for a 30-person company, in two days instead of two months. The foundation has never been cheaper or faster to build.

    "You need your editorial AI agents, your fact-checking AI agents. Break down roles and tasks into multiple agents." The mental model of AI as a "magic black box" producing perfect outputs from generic inputs is the most overrated idea in AI today. Garrett argues that AI agents should be governed by the same structures and processes used for humans. News organizations don't let reporters publish directly to the Wall Street Journal: there's an editorial process. AI systems need the same rigor.

    "Every company needs an OpenClaw strategy. Not tomorrow, but you need to be experimenting now." Within a few years, Garrett predicts it will be normal to have group chats where 60% of participants are agents. But that requires solving identity management first. Agents need distinct identities: their own Notion accounts, Gmail addresses, and Slack profiles, not fake personas pretending to be human. The security infrastructure for this doesn't fully exist yet, but it's coming.

    "Focus too much on ROI per use case, and you're missing the forest for the trees." Garrett separates AI investments into two categories: (1) LLM-based process automation, where token spend vs. ROI matters immediately, and (2) personal productivity and general-purpose agents, which need a longer time horizon. The second category requires investment like any training program: you won't see quarterly returns, but year over year you'll double your business with flat headcount.

    "People go, 'I'll just put the LLM in, it'll be better than a rules engine.' But you need to break that mental model." One of Garrett's most practical insights: use AI to build comprehensive rules-based systems, not to replace them with LLMs running on every transaction. LLMs can design, build, update, and monitor deterministic processes, but they don't need to be in the execution path burning tokens. The old mental model that rules engines are hard to build no longer applies.

    "In a year, the distinction between CTO, CIO, and Chief Digital Officer doesn't matter. The job's always been the same." As AI democratizes technical capability, leadership roles converge. The "ivory tower" IT leader is gone. You have to be both a great leader and an active builder, using AI tools at a granular level, not just strategically. If you're a CFO who's never done a VLOOKUP, how can you lead a finance team? If you're a CTO who hasn't done the prompting, how can you lead technology?
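    The rules-engine pattern Garrett describes can be sketched as follows. This is a minimal illustration, not anything shown in the episode: the rule names, fields, and thresholds are invented, and the point is only the shape of the design, where an LLM might draft or revise the rules offline while the per-transaction execution path stays deterministic and burns no tokens.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    """A deterministic rule; an LLM could help draft or update these offline."""
    name: str
    condition: Callable[[dict], bool]
    action: str

# Hypothetical rules committed as code/config after human review.
RULES = [
    Rule("flag_large_invoice", lambda t: t["amount"] > 10_000, "route_to_review"),
    Rule("auto_approve_small", lambda t: t["amount"] <= 500, "approve"),
]

def evaluate(transaction: dict) -> list[str]:
    """Execution path for every transaction: no LLM call, no token spend."""
    return [r.action for r in RULES if r.condition(transaction)]

print(evaluate({"amount": 12_000}))  # ['route_to_review']
```

    The LLM's job in this model is maintenance of `RULES` (design, review, updates), never per-transaction inference.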

    1h 14m
  2. Modesty in Tech: Lessons Learned from Building Platforms at NASA

    APR 16

    Modesty in Tech: Lessons Learned from Building Platforms at NASA

    In this episode, Lucas Hendrich talks with Eashwer Srinivasan, CTO at Sonny’s Car Wash, about what happens when software stops being a product and becomes operational infrastructure. The conversation spans Eashwer’s early experience at NASA, the realities of scaling platforms before cloud computing, and why industries that appear simple on the surface often require the most complex systems behind the scenes.

    “Working at NASA is a humbling experience because tech is like electricity.” At NASA, technology teams existed to support the mission, not to be the focus of it. Scientists and astronauts cared about whether systems worked reliably, not which platform they were built on. That environment shaped a mindset where reliability and delivery mattered more than architectural elegance.

    “We went live around midnight… six o’clock in the morning, Columbia crashed… the site stood there and took hundreds of millions of hits.” A platform launch was immediately tested by a real-world crisis. The public rushed to NASA’s site for information, and the system had to scale instantly, without cloud infrastructure. It became a defining lesson in operational readiness and resilience.

    “Use AI to assist, to help, not just to fully write.” At Sonny’s, generative AI is used in the software development lifecycle, particularly for generating test cases and improving testing coverage. The team saw efficiency gains but also learned that fully AI-generated code created maintenance challenges, reinforcing the need for human oversight.

    “You walk into most car washes and the point-of-sale system has probably been there for 20 years. It works.” Legacy systems persist in operational businesses because reliability matters more than novelty. Modernization requires integrating new capabilities, such as customer engagement, reporting, and automation, without disrupting existing operations.

    “Think about predictive maintenance… vibrations, water pressure… and computer vision so cars don’t collide.” Modern car wash operations rely on software controlling real-world processes. Systems monitor equipment health, guide maintenance decisions, and ensure safety inside automated tunnels.

    “As part of onboarding, you go to the Car Wash College: entire tunnels set up to learn how everything works.” Understanding the physical environment is essential. Engineers working on the platform must understand the machinery, operators, and workflows because the software directly interacts with real-world equipment and customers.
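    The predictive-maintenance idea mentioned above, in its simplest form, is threshold monitoring over sensor readings. The sketch below is purely illustrative: the sensor names, limits, and alert wording are all assumptions, not details from the episode.

```python
# Hypothetical operating limits for tunnel equipment (invented values).
VIBRATION_LIMIT_MM_S = 7.1                 # alarm level for motor vibration
WATER_PRESSURE_RANGE_PSI = (40.0, 80.0)    # acceptable pump pressure band

def check_equipment(readings: dict) -> list[str]:
    """Return maintenance alerts for any out-of-range sensor readings."""
    alerts = []
    if readings["vibration_mm_s"] > VIBRATION_LIMIT_MM_S:
        alerts.append("vibration above alarm level: schedule bearing inspection")
    lo, hi = WATER_PRESSURE_RANGE_PSI
    if not lo <= readings["water_pressure_psi"] <= hi:
        alerts.append("water pressure out of range: check pump and lines")
    return alerts

print(check_equipment({"vibration_mm_s": 9.0, "water_pressure_psi": 60.0}))
```

    Real systems would add trend analysis over time rather than single-reading thresholds, but the deterministic check above is the core of the pattern.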

    47 min
  3. Invention vs. Innovation: Lessons from VMware’s vMotion

    MAR 26

    Invention vs. Innovation: Lessons from VMware’s vMotion

    In this episode, Lucas Hendrich, CTO at Forte Group, sits down with Kit Colbert, Platform CTO at Invisible Technologies and former CTO at VMware, to explore the difference between building something new and creating real impact, and why that distinction matters more than ever in the age of AI.

    "Invention is about something new. Innovation is about impact." After spending nearly 20 years at VMware, including pioneering work on vMotion and Storage vMotion, Kit learned this lesson the hard way. He built Storage vMotion largely on his own, a technology that could turn multi-month, multi-million-dollar storage migrations into a simple drag-and-drop operation. But without marketing, sales enablement, and tech support aligned, the invention would have created zero innovation. The "red tape" he tried to avoid was actually the machinery that creates customer value.

    "By the end of 2026, AI agents will be able to run for a week on their own." Today's frontier is around 12 hours of autonomous operation. Kit predicts that within months, you'll be able to give an AI agent a complex project plan on Monday and check back the following Monday, just like managing a capable human team member. This shift will fundamentally change how engineering teams are structured and how work gets orchestrated.

    "The two-pizza team is going away. We're moving to half-pizza teams." When developers have 5-8 AI agents working in parallel, the traditional team size becomes obsolete. Kit sees a future where 2-3 people manage 20-30 agents in aggregate, creating enormous pressure on product management and task orchestration. The challenge isn't writing code: it's managing the volume of code being generated and ensuring it creates real value.

    "We're already assuming a developer will be twice as expensive by year-end because of token costs." Kit's CFO is planning for engineers to consume a full salary's worth of tokens annually. Rather than limiting consumption, the focus is on smart usage. If spending $6,000 a month on AI tools makes a developer 10x more productive, that's a bargain. The economics are shifting rapidly, and organizational planning needs to catch up.

    "Two-thirds of software changes don't make any measurable difference or make things worse." Drawing on Microsoft Research findings, Kit emphasizes that being wrong is normal in software development. The key is limiting effort per experiment and closing feedback loops quickly. This insight, central to the Agile revolution, becomes even more critical when AI can generate massive amounts of code. Speed without validation is just expensive noise.

    "I learned more from failing as a GM than from any other role." Kit's journey from IC developer to managing 2,400 engineers wasn't linear. He jumped from a 150-person organization to 15, then to 2,400. The "failure" as a GM taught him lessons about decision-making, letting go of technical control, and understanding that his value to the organization had fundamentally changed. Leading through influence scales differently than technical expertise.
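    The token-economics argument above is simple arithmetic, sketched below. Only the $6,000/month tool spend and the 10x multiplier come from the conversation; the salary figure is an assumption for illustration.

```python
salary_per_year = 180_000        # assumed fully loaded engineer cost (not from the episode)
tool_spend_per_month = 6_000     # AI tool spend cited in the episode
productivity_multiplier = 10     # claimed productivity uplift

annual_tool_cost = tool_spend_per_month * 12
effective_cost_per_engineer = salary_per_year + annual_tool_cost

# Cost per "unit of output" before vs. after the tooling:
before = salary_per_year / 1
after = effective_cost_per_engineer / productivity_multiplier

print(annual_tool_cost)  # 72000
print(after)             # 25200.0
```

    Under these assumptions the tooling raises per-engineer cost by 40% while cutting cost per unit of output by roughly 7x, which is the sense in which "twice as expensive" can still be a bargain.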

    1h 2m
  4. The Actuarial Engineer: Bridging Data Science and Health Tech

    FEB 9

    The Actuarial Engineer: Bridging Data Science and Health Tech

    In this episode, Lucas Hendrich, CTO at Forte Group, sits down with Robert Stewart, CTO at Arbital Health, to explore the intersection of healthcare technology, AI, and engineering culture in one of the most complex sectors of the economy.

    "We created a position that I don't know if it exists anywhere else: an actuarial engineer." Rather than training engineers to understand actuarial science, Robert's team takes actuaries (experts who've passed rigorous exams and understand complex healthcare finance) and teaches them to code. With AI-powered coding tools, this unconventional approach is proving remarkably successful, allowing subject matter experts to build production features themselves.

    "AI makes value-based care easier to administer by removing the operational friction." Value-based care shifts incentives from volume to outcomes, but measuring those outcomes creates significant administrative burden. Arbital Health's platform ingests fragmented healthcare data, applies actuarial models, and provides clear visualizations that both payers and providers can understand, reducing friction while maintaining transparency.

    "I focused very much on inclusivity: about half our hackathon participants weren't from engineering or product." Over 27 hackathons at his previous company, Robert perfected a formula: open participation to all departments, reward cross-functional teams, and focus on mission-driven innovation rather than backlog work. The result? During COVID, 150 people participated in a remote hackathon that produced 11 production features, including work that eventually powered vaccines.gov.

    "Healthcare interoperability is a major problem because of financial incentives and distribution." Unlike telecommunications, where incompatibility would be fatal, healthcare systems were often built in isolation with limited incentive to share data. Combined with legacy systems from the '70s and '80s, regulatory complexity, and genuine privacy concerns, the industry faces challenges that aren't purely technical: they're structural.

    "Having a mentor who understood not just engineering but finance changed my career." Early in his career, Robert worked with a leader who demonstrated that great CTOs don't just manage code: they understand the business, talk to customers, and develop empathy for what motivates each team member. That experience shaped his philosophy: organizations should operate intentionally, not by routine or comfort level.

    1 hr
  5. Disrupting the Testing Industry: AI and Infrastructure Moats

    JAN 26

    Disrupting the Testing Industry: AI and Infrastructure Moats

    In this episode, Lucas Hendrich, CTO at Forte Group, sits down with Anoop Tripathi, CTO at Sauce Labs, to discuss how AI is reshaping software development, and why quality, simplicity, and engineering judgment matter more than ever.

    “People don’t need a PhD in your product to use it.” For Anoop, simplicity is not a design preference but a competitive advantage. From early platform shifts at Citrix to modern SaaS products, he has seen the same pattern repeat: tools that reduce friction consistently outperform those that add complexity.

    “The software industry isn’t complex because the problems are complex.” Anoop argues that much of today’s software complexity is self-inflicted. Over-engineering, layered abstractions, and unnecessary automation often make systems harder to operate, scale, and secure than they need to be.

    “AI is going to produce a lot more software.” As generative AI lowers the cost of writing code, the volume of software will grow exponentially. That shift makes testing and quality engineering more critical, not less. According to Anoop, AI doesn’t eliminate testing; it amplifies the need for it.

    “Smart engineers don’t write that much code. They fix code.” While AI can generate functional software quickly, production-ready systems still require experienced engineers to interpret outputs, refine architectures, and ensure long-term maintainability.

    “If you automate everything, you often lose intelligence.” Anoop cautions against blindly adopting agentic architectures. AI excels in non-deterministic problem spaces, but forcing intelligence into deterministic systems often introduces fragility rather than resilience.

    Stay tuned for more conversations at the intersection of autonomy, leadership, and engineering culture on CTO2CTO.

    50 min

Ratings & Reviews

5 out of 5 (5 Ratings)

About

CTO2CTO features engaging conversations between Chief Technology Officers (CTOs). Each episode contains insights on technology trends, innovation and software development.