The New Quantum Era - innovation in quantum computing, science and technology

Sebastian Hassinger

Your host, Sebastian Hassinger, interviews brilliant research scientists, software developers, engineers and others actively exploring the possibilities of our new quantum era. We will cover topics in quantum computing, networking and sensing, focusing on hardware, algorithms and general theory. The show aims for accessibility - Sebastian is not a physicist - and we'll try to provide context for the terminology and glimpses of the fascinating history of this new field as it evolves in real time.

  1. 5D AGO

    Quantum consciousness with Joachim Keppler

    What if consciousness isn’t generated by the brain, but emerges from its interaction with a ubiquitous quantum field? In this episode, Sebastian Hassinger and theoretical physicist Joachim Keppler explore a zero‑point field model of consciousness that could reshape both neuroscience and quantum theory.

    Summary
    This conversation is for anyone curious about the “hard problem” of consciousness, quantum brain theories, and the future of quantum biology and AI. Joachim shares his QED‑based framework in which the brain couples to the electromagnetic zero‑point field via glutamate, producing macroscopic quantum effects that correlate with conscious states. You’ll hear how this model connects existing neurophysiology, testable predictions, and deep questions in the philosophy of mind.

    What You’ll Learn
    - How a quantum field theorist ended up founding an institute for the scientific study of consciousness and building a rigorous, physics‑grounded framework for it.
    - Why consciousness may hinge on a universal principle: the brain’s resonant coupling to the electromagnetic zero‑point field, not just classical neural firing.
    - What macroscopic quantum phenomena in the brain look like, including coherence domains, self‑organized criticality, and long‑range synchronized activity patterns linked to conscious states.
    - How glutamate, the brain’s most abundant neurotransmitter, could act as the molecular interface to the zero‑point field inside cortical microcolumns.
    - Which concrete experiments could confirm or falsify this theory, from detecting macroscopic quantum coherence in neurotransmitter molecules to measuring glutamate‑driven biophoton emissions with a specific quantum “fingerprint.”
    - Why Joachim sees the zero‑point field as a dual‑aspect “psychophysical” field, and how that reframes classic philosophy‑of‑mind debates about qualia and the nature of awareness.
    - What this perspective implies for artificial consciousness, and whether future quantum computers or engineered systems might couple to the field and become genuinely conscious rather than merely simulating it.
    - How quantum biology could offer an evolutionary path for consciousness, extending field‑coupling ideas from the human brain down to simpler organisms and bacterial signaling.

    Resources & Links
    - DIWISS – Research institute for the scientific study of consciousness.
    - “Macroscopic quantum effects in the brain: new insights into the neural correlates of consciousness” – Research article outlining the QED/zero‑point field model and its neurophysiological connections.
    - “A New Way of Looking at the Neural Correlates of Consciousness” – Paper introducing the idea that the full spectrum of qualia is encoded in the zero‑point field.
    - “The Role of the Brain in Conscious Processes: A New Way of Understanding the Neural Correlates of Consciousness” – Paper that further develops the brain‑as‑interface, ZPF‑based framework.
    - “Human high intelligence is involved in spectral redshift of biophotonic activities in the brain” – Studies on glutamate‑linked emissions in brain tissue; experiments that inform potential tests of the theory.

    Key Quotes or Insights
    - “The brain may not produce consciousness; it may tune into it by coupling to the zero‑point field, like a resonant oscillator accessing a universal substrate of awareness.”
    - “Conscious states correspond to macroscopic quantum patterns in the brain—highly synchronized, near‑critical dynamics that disappear when the field coupling breaks down in unconsciousness.”
    - “Glutamate‑rich cortical microcolumns could be the molecular gateway to the zero‑point field, forming coherence domains that orchestrate neuronal firing from the bottom up.”
    - “If we can engineer systems that replicate this field‑coupling mechanism, we might not just simulate consciousness—we might be building genuinely conscious artificial systems.”
    - “Quantum biology could reveal an evolutionary continuum of field‑coupling, from simple organisms to humans, reframing how we think about life, intelligence, and mind.”

    37 min
  2. FEB 2

    Quantum Leadership with Nadya Mason

    What happens when a former elite gymnast with “weak math and science” becomes dean of one of the world’s most influential quantum engineering schools? In this episode of *The New Quantum Era*, Sebastian Hassinger talks with Prof. Nadya Mason about quantum 2.0, building a regional quantum ecosystem, and why she sees leadership as a way to serve and build community rather than accumulate power.

    Summary
    This conversation is for anyone curious about how quantum materials research, academic leadership, and large‑scale public investment are shaping the next phase of quantum technology. You’ll hear how Nadya’s path from AT&T Bell Labs to dean of the Pritzker School of Molecular Engineering at UChicago informs her service‑oriented approach to leadership and ecosystem building. The discussion spans superconducting devices, Chicago’s quantum hub strategy, and what it will actually take to build a diverse, job‑ready quantum workforce in time for the coming wave of applications.

    What You’ll Learn
    - How a non‑linear path (elite sports, catching up in math, early lab work) can lead to a career at the center of quantum science and engineering.
    - Why condensed matter and quantum materials are the quiet “bottleneck” for scalable quantum computing, networking, and transduction technologies.
    - How superconducting junctions, Andreev bound states, and hybrid devices underpin today’s superconducting qubits and topological quantum efforts.
    - The difference between “quantum 1.0” (lasers, GPS, nuclear power, semiconductors) and “quantum 2.0,” which focuses on sensing, communication, and computation.
    - How the Pritzker School of Molecular Engineering and the Chicago Quantum Exchange are deliberately knitting together universities, national labs, industry, and state funding into a cohesive quantum cluster.
    - Why Nadya frames leadership as building communities around science and opportunity, and what that means in a faculty‑driven environment where “nobody works for the dean.”
    - Concrete ways Illinois and UChicago are approaching quantum education and workforce development, from REUs and the Open Quantum Initiative to the South Side Science Fair.
    - Why early math confidence plus hands‑on research experience are the two most important ingredients for preparing the next generation of quantum problem‑solvers.

    Resources & Links
    - Pritzker School of Molecular Engineering, University of Chicago – Nadya’s home institution, pioneering an interdisciplinary, theme‑based approach to quantum, materials for sustainability, and immunoengineering.
    - Chicago Quantum Exchange – Regional hub connecting universities, national labs, and industry to build quantum networks, workforce, and commercialization pathways.
    - South Side Science Fair (UChicago) – Large‑scale outreach effort bringing thousands of local students to campus to encounter science and quantum concepts early.

    Key Quotes or Insights
    - “A rainbow is more beautiful because I understand the refraction behind it”—how physics deepened Nadya’s sense of wonder rather than reducing it.
    - “In condensed matter, the devil is in the material—and the interfaces”—why microscopic imperfections and humidity‑induced “schmutz” can make or break quantum devices.
    - “Quantum 1.0 gave us lasers, GPS, and nuclear power; quantum 2.0 is about using quantum systems to *process* information through sensing, networking, and computing.”
    - “If you want to accumulate power, academia is not the place—faculty don’t work for me. Leadership here is about building community and creating opportunities.”
    - “If we want to lead in quantum as a country, we have to make math skills and real lab experiences accessible early, so kids even know this world exists as an option.”

    Calls to Action
    - Subscribe to The New Quantum Era and share this episode with a colleague or student who’s curious about quantum careers and leadership beyond the usual narratives.
    - If you’re an educator or program lead, explore ways to bring hands‑on research experiences and accessible math support into your classroom or community programs.
    - If you’re in industry, academia, or policy, consider how you or your organization can plug into regional quantum ecosystems like Chicago’s to support training, internships, and inclusive hiring.

    46 min
  3. JAN 26

    Democratizing Quantum Venture Investing with Chris Sklarin

    Your host, Sebastian Hassinger, talks with Alumni Ventures managing partner Chris Sklarin about how one of the most active US venture firms is building a quantum portfolio while “democratizing” access to VC as an asset class for individual investors. They dig into Alumni Ventures’ co‑investor model, how the firm thinks about quantum hardware, software, and sensing, and why quantum should be viewed as a long‑term platform with near‑term pockets of commercial value. Chris also explains how accredited investors can start seeing quantum deal flow through Alumni Ventures’ syndicate.

    Chris’ background and Alumni Ventures in a nutshell
    - Chris is an MIT‑trained engineer who spent years in software startups before moving into venture more than 20 years ago.
    - Alumni Ventures is a roughly decade‑old firm focused on “democratizing venture capital” for individual investors, with over 11,000 LPs, more than 1.5 billion dollars raised, and about 1,300 active portfolio companies.
    - The firm has been repeatedly recognized as a highly active VC by CB Insights, PitchBook, Stanford GSB, and Time magazine.

    How Alumni Ventures structures access for individuals
    - Most investors come in as individuals into LLC‑structured funds rather than traditional GP/LP funds.
    - Alumni Ventures always co‑invests alongside a lead VC, using the lead’s conviction, sector expertise, and diligence as a key signal.
    - The platform also offers a syndicate where accredited investors can opt in to see and back individual deals, including those tagged for quantum.

    Quantum in the Alumni Ventures portfolio
    - Alumni Ventures has 5–6 quantum‑related investments spanning hardware, software, and applications, including Rigetti, Atom Computing, Q‑CTRL, Classiq, and quantum‑error‑mitigation startup Qedma.
    - Rigetti was one of the firm’s earliest quantum investments; the team followed on across multiple rounds and was able to return capital to investors after Rigetti’s SPAC and a strong period in the public markets.
    - Chris also highlights interest in Cycle Dre (a new company from Rigetti’s former CTO) and application‑layer companies like InQ and quantum sensing players.

    Barbell funding and the “3–5 year” view
    - Chris responds to the now‑familiar “barbell” funding picture in quantum—a few heavily funded players and a long tail of small companies—by emphasizing near‑term revenue over pure science experiments.
    - He sees quantum entering an era where companies must show real products, customers, and revenue, not just qubit counts.
    - Over the next 3–5 years, he expects meaningful commercial traction first in areas like quantum sensing, navigation, and point solutions in chemistry and materials, with full‑blown fault‑tolerant systems further out.

    Hybrid compute and NVIDIA’s signal to the market
    - Chris points to Jensen Huang’s GTC 2025 keynote slide on NVIDIA’s hybrid quantum–GPU ecosystem, where Alumni Ventures portfolio companies such as Atom Computing, Classiq, and Rigetti appeared.
    - He notes that NVIDIA will not put “science projects” on that slide—those partnerships reflect a view that quantum processors will sit tightly coupled next to GPUs to handle specific workloads.
    - He also mentions a large commercial deal between NVIDIA and Groq (a classical AI chip company in his portfolio) as another sign of a more heterogeneous compute future that quantum will plug into.

    Where near‑term quantum revenue shows up
    - Chris expects early commercial wins in sensing, GPS‑denied navigation, and other narrow but valuable applications before broad “quantum advantage” in general‑purpose computing.
    - Software and middleware players can generate revenue sooner by making today’s hardware more stable, more efficient, or easier to program, and by integrating into classical and AI workflows.
    - He stresses that investors love clear revenue paths that fit into the 10‑year life of a typical venture fund.

    University spin‑outs, clustering, and deal flow
    - Alumni Ventures certainly sees clustering around strong quantum schools like MIT, Harvard, and Yale, but Chris emphasizes that the “alumni angle” is secondary to the quality of the venture deal.
    - Mature tech‑transfer offices and standard Delaware C‑corps mean spinning out quantum IP from universities is now a well‑trodden path.
    - Chris leans heavily on network effects—Alumni Ventures’ 800,000‑person network and 1,300‑company CEO base—as a key channel for discovering the most interesting quantum startups.

    Managing risk in a 100‑hardware‑company world
    - With dozens of hardware approaches now in play, Chris uses Alumni Ventures’ co‑investor model and lead‑investor diligence as a filter rather than picking purely on physics bets.
    - He looks for teams with credible near‑term commercial pathways and for mechanisms like sensing or middleware that can create value even if fault‑tolerant systems arrive later than hoped.
    - He compares quantum to past enabling waves like nanotech, where the biggest impact often shows up as incremental improvements rather than a single “big bang” moment.

    Democratizing access to quantum venture
    - Alumni Ventures allows accredited investors to join its free syndicate, self‑attest accreditation, and then see deal materials—watermarked and under NDA—for individual investments, including quantum.
    - Chris encourages people to think in terms of diversified funds (20–30 deals per fund year) rather than only picking single names in what is a power‑law asset class.
    - He frames quantum as a long‑duration infrastructure play with near‑term pockets of usefulness, where venture can help investors participate in the upside without getting ahead of reality.

    33 min
  4. JAN 19

    Regional quantum development with Alejandra Y. Castillo

    Alejandra Y. Castillo, former Assistant Secretary of Commerce for Economic Development and now Chancellor Senior Fellow for Economic Development at Purdue University Northwest, joins your host, Sebastian Hassinger, to discuss how quantum technologies can drive inclusive regional economic growth and workforce development. She shares lessons from federal policy, Midwest tech hubs, and cross-state coalitions working to turn quantum from lab research into broad-based opportunity.

    Themes and key insights
    - Quantum as near-term and multi-faceted: Castillo pushes back on the idea that quantum is distant, emphasizing that computing, sensing, and communications are already maturing and attracting serious investment from traditional industries like biopharma.
    - From federal de-risking to regional ecosystems: She describes the federal role as de-risking early innovation through programs under the CHIPS and Science Act, while stressing that long-term success depends on regional coalitions across states, universities, industry, philanthropy, and local government.
    - Inclusive workforce and supply-chain planning: Castillo argues that the “quantum workforce” must go beyond PhDs to include a mapped ecosystem of jobs, skills, suppliers, housing, and infrastructure, so that local communities see quantum as opportunity, not displacement.
    - National security, urgency, and inclusion: She frames sustained quantum investment as both an economic and a national security imperative, warning that inconsistent U.S. funding risks falling behind foreign competitors, while also noting that private capital alone may ignore inclusion and regional equity.

    Notable quotes
    - “We either focus on the urgency or we’re going to have to focus on the emergency.”
    - “No one state is going to do this… This is a regional play that we will be called to answer for the sake of a national security play as well.”
    - “We want to make sure that entire regions can actually reposition themselves from an economic perspective, so that people can stay in the places they call home—now we’re talking about quantum.”
    - “Are we going to make that same mistake again, or should we start to think about and plan how quantum is going to also impact us?”

    Articles, papers, and initiatives mentioned
    - America's quantum future depends on regional ecosystems like Chicago's – Alejandra’s editorial in Crain’s Chicago Business calling for sustained, coordinated investment in quantum as a national security and economic priority, highlighting the role of the Midwest and tech hubs.
    - CHIPS and Science Act (formerly “Endless Frontier”) – U.S. legislation that authorized large-scale funding for semiconductors and science, enabling EDA’s Tech Hubs and NSF’s Engines programs to back regional coalitions in emerging technologies like quantum.
    - EDA Tech Hubs and NSF Engines programs – Federal initiatives that fund multi-state consortiums combining universities, companies, and civic organizations to build durable regional innovation ecosystems, including quantum-focused hubs in the Midwest.
    - National Quantum Algorithms Center – A center exploring quantum algorithms for real-world problems such as natural disasters and biopharma discovery, aiming to connect quantum advances directly to societal challenges.
    - Roberts Impact Lab at Purdue Northwest (with Quantum Corridor) – A testbed and workforce development center focused on quantum, AI, and post-quantum cryptography, designed to prepare local talent and companies for quantum-era applications.
    - Chicago Quantum Exchange and regional partners (Illinois, Indiana, Wisconsin) – A multi-university, multi-state collaboration that pioneered a model for regional quantum ecosystems.

    32 min
  5. JAN 12

    Majorana qubits with Chetan Nayak

    In this episode of The New Quantum Era, your host Sebastian Hassinger is joined by Chetan Nayak, Technical Fellow at Microsoft, professor of physics at the University of California Santa Barbara, and driving force behind Microsoft's quantum hardware R&D program. They discuss a qubit modality that has not been covered on the podcast before, based on Majorana fermionic behavior, which promises topological protection against the errors that are such a challenge for quantum computing.

    Guest Bio
    - Chetan Nayak is a Technical Fellow at Microsoft and leads the company’s topological quantum hardware program, including the Majorana 1 processor based on Majorana‑zero‑mode qubits.
    - He is also a professor of physics at UCSB and a leading theorist in topological phases of matter, non‑Abelian anyons, and topological quantum computation.
    - Chetan co‑founded Microsoft’s Station Q in 2005, building a bridge from theoretical proposals for topological qubits to engineered semiconductor–superconductor devices.

    What we talk about
    - Chetan’s first exposure to quantum computing in Peter Shor’s lectures at the Institute for Advanced Study, and how that intersected with his PhD work with Frank Wilczek on non‑Abelian topological phases and Majorana zero modes.
    - The early days of topological quantum computation: fractional quantum Hall states, emergent quasiparticles, and the realization that braiding these excitations naturally implements Clifford gates.
    - How Alexei Kitaev’s toric‑code and Majorana‑chain ideas connected abstract topology to concrete condensed‑matter systems, and led to Chetan’s collaboration with Michael Freedman and Sankar Das Sarma.
    - The 2005 proposal for a gallium‑arsenide quantum Hall device realizing a topological qubit, and the founding of Station Q to turn such theoretical blueprints into experimental devices in partnership with academic labs.
    - Why Microsoft pivoted from quantum Hall platforms to semiconductor–superconductor nanowires: leveraging the Fu–Kane proximity effect, spin–orbit‑coupled semiconductors, and a huge material design space—while wrestling with the challenges of interfaces and integration.
    - The evolution of the tetron architecture: two parallel topological nanowires with four Majorana zero modes, connected by a trivial superconducting wire and coupled to quantum dots that enable native Z‑ and X‑parity loop measurements.
    - How topological superconductivity allows a superconducting island to host even or odd total electron parity without a local signature, and why that nonlocal encoding provides hardware‑level protection for the qubit’s logical 0 and 1.
    - Microsoft’s roadmap in a 2D “quality vs. complexity” space: improving topological gap, readout signal‑to‑noise, and measurement fidelity while scaling from single tetrons to error‑corrected logical qubits and, ultimately, utility‑scale systems.
    - Error correction on top of topological qubits: using surface codes and Hastings–Haah Floquet codes with native two‑qubit parity measurements, and targeting hundreds of physical tetrons per logical qubit and thousands of logical qubits for applications like Shor’s algorithm and quantum chemistry.
    - Engineering for scale: digital, on–off control of quantum‑dot couplings; cryogenic CMOS to fan out control lines inside the fridge; and why tetron size and microsecond‑scale operations sit in a sweet spot for both physics and classical feedback.
    - Where things stand today: the Majorana 1 chiplet, recent tetron loop‑measurement experiments, DARPA’s US2QC program, and how external users—starting with government and academic partners—will begin to access these devices before broader Azure Quantum integration.

    Papers and resources mentioned
    These are representative papers and resources that align with topics and allusions in the conversation; they are good entry points if you want to go deeper.
    - Non‑Abelian Anyons and Topological Quantum Computation – C. Nayak, S. H. Simon, A. Stern, M. Freedman, S. Das Sarma, Rev. Mod. Phys. 80, 1083 (2008).
    - Topological quantum computation – Sankar Das Sarma, Michael Freedman, and Chetan Nayak, Physics Today 59(7), 32–38 (July 2006); early device proposals.
    - Roadmap to fault‑tolerant quantum computation using topological qubits – C. Nayak et al., arXiv:2502.12252.
    - Distinct lifetimes for X and Z loop measurements in a Majorana tetron – C. Nayak et al., arXiv:2507.08795.
    - Majorana qubit codes that also correct odd‑weight errors – S. Kundu and B. Reichardt, arXiv:2311.01779.
    - Microsoft's Majorana 1 chip carves new path for quantum computing – Microsoft blog post.
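    The four-Majorana encoding discussed above can be illustrated with a toy calculation. This is a generic textbook-style sketch, not a model of Microsoft's device: it represents four Majorana operators with numpy via a Jordan–Wigner mapping and checks that the pairwise parities iγ₁γ₂ and iγ₂γ₃ behave like logical Pauli Z and X on the encoded qubit, while the total fermion parity commutes with both.

```python
import numpy as np

# Single-qubit Pauli matrices
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Jordan-Wigner Majoranas for two fermionic modes (4-dim Hilbert space):
# gamma_1 = X(x)I, gamma_2 = Y(x)I, gamma_3 = Z(x)X, gamma_4 = Z(x)Y
g = [np.kron(X, I2), np.kron(Y, I2), np.kron(Z, X), np.kron(Z, Y)]

# Verify the Majorana algebra: {gamma_i, gamma_j} = 2 * delta_ij
for i in range(4):
    for j in range(4):
        anti = g[i] @ g[j] + g[j] @ g[i]
        expected = 2 * np.eye(4) if i == j else np.zeros((4, 4))
        assert np.allclose(anti, expected)

# Logical operators of the encoded qubit are Majorana-pair parities:
Zl = 1j * g[0] @ g[1]   # Z-type parity (one "wire")
Xl = 1j * g[1] @ g[2]   # X-type parity (across the wires)

assert np.allclose(Zl @ Zl, np.eye(4))   # Z_L squares to identity
assert np.allclose(Xl @ Xl, np.eye(4))   # X_L squares to identity
assert np.allclose(Zl @ Xl, -Xl @ Zl)    # they anticommute, like Pauli Z and X

# Total parity P = -gamma_1 gamma_2 gamma_3 gamma_4 commutes with both
# logicals, so the qubit lives inside a fixed total-parity sector.
P = -g[0] @ g[1] @ g[2] @ g[3]
assert np.allclose(P @ Zl, Zl @ P) and np.allclose(P @ Xl, Xl @ P)
print("4 Majoranas encode one qubit within a fixed total-parity sector")
```

    The point of the sketch is the nonlocality mentioned in the episode: Z_L and X_L are each products of two Majoranas that sit on different wires, so no single local operator reveals the encoded state.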

    1h 3m
  6. 12/12/2025

    Peaked quantum circuits with Hrant Gharibyan

    In this episode of The New Quantum Era, Sebastian talks with Hrant Gharibyan, CEO and co‑founder of BlueQubit, about “peaked circuits” and the challenge of verifying quantum advantage. They unpack Scott Aaronson and Yuxuan Zhang’s original peaked‑circuit proposal, BlueQubit’s scalable implementation on real hardware, and a new public challenge that invites the community to attack their construction using the best classical algorithms available. Along the way, they explore how this line of work connects to cryptography, hardness assumptions, and the near‑term role of quantum devices as powerful scientific instruments.

    Topics Covered
    - Why verifying quantum advantage is hard: The core problem is that if a quantum device claims to solve a task that is classically intractable, how can anyone check that it did the right thing? Random circuit sampling (as in Google’s 2019 “supremacy” experiment and follow‑on work from Google and Quantinuum) is believed to be classically hard to simulate, but the verification metrics (like cross‑entropy benchmarking) are themselves classically intractable at scale.
    - What are peaked circuits? Aaronson and Zhang’s idea: construct circuits that look like random circuits in every respect, but whose output distribution secretly has one special bit string with an anomalously high probability (the “peak”). The designer knows the secret bit string, so a quantum device can be verified by checking that measurement statistics visibly reveal the peak in a modest number of shots, while finding that same peak classically should be as hard as simulating a random circuit.
    - BlueQubit’s scalable construction and hardware demo: BlueQubit extended the original 24‑qubit, simulator‑based peaked‑circuit construction to much larger sizes using new classical protocols. Hrant explains their protocol for building peaked circuits on Quantinuum’s H2 processor with around 56 qubits, thousands of gates, and effectively all‑to‑all connectivity, while still hiding a single secret bit string that appears as a clear peak when run on the device.
    - Obfuscation tricks and “quantum steganography”: The team uses multiple obfuscation layers (including “swap” and “sweeping” tricks) to transform simple peaked circuits into ones that are statistically indistinguishable from generic random circuits, yet still preserve the hidden peak.
    - The BlueQubit Quantum Advantage Challenge: To stress‑test their hardness assumptions, BlueQubit has published concrete circuits and launched a public bounty (currently a quarter of a bitcoin) for anyone who can recover the secret bit string classically. The aim is to catalyze work on better classical simulation and de‑quantization techniques; either someone closes the gap (forcing the protocol to evolve) or the standing bounty helps establish public trust that the task really is classically infeasible.
    - Potential cryptographic angles: Although the main focus is verification of quantum advantage, Hrant outlines how the construction has a cryptographic flavor: a secret bit string effectively acts as a key, and only a sufficiently powerful quantum device can efficiently “decrypt” it by revealing the peak. Variants of the protocol could, in principle, yield schemes that are classically secure but only decryptable by quantum hardware, and even quantum‑plus‑key secure, though this remains speculative and secondary to the verification use case.
    - From verification protocol to startup roadmap: Hrant positions BlueQubit as an algorithm and capability company: deeply hardware‑aware, but focused on building and analyzing advantage‑style algorithms tailored to specific devices. The peaked‑circuit work is one pillar in a broader effort that includes near‑term scientific applications in condensed‑matter physics and materials (e.g., Fermi–Hubbard models and out‑of‑time‑ordered correlators) where quantum devices can already probe regimes beyond leading classical methods.
    - Scientific advantage today, commercial advantage tomorrow: Sebastian and Hrant emphasize that the first durable quantum advantages are likely to appear in scientific computing—acting as exotic lab instruments for physicists, chemists, and materials scientists—well before mass‑market “killer apps” arrive. Once robust, verifiable scientific advantage is established, scaling to larger models and more complex systems becomes a question of engineering, with clear lines of sight to industrial impact in sectors like pharmaceuticals, advanced materials, and manufacturing.

    The challenge: https://app.bluequbit.io/hackathons/

    30 min
  7. 12/06/2025

    Diamond vacancies and scalable qubits with Quantum Brilliance

    Episode overview
    This episode of The New Quantum Era features a conversation with Quantum Brilliance co‑founder and CEO Mark Luo and independent board chair Brian Wong about diamond nitrogen‑vacancy (NV) centers as a platform for both quantum computing and quantum sensing. The discussion covers how NV centers work, what makes diamond‑based qubits attractive at room temperature, and how to turn a lab technology into a scalable product and business.

    What are diamond NV qubits?
    - Mark explains how nitrogen‑vacancy centers in synthetic diamond act as stable room‑temperature qubits, with a nitrogen atom adjacent to a missing carbon atom creating a spin system that can be initialized and read out optically or electronically.
    - The rigidity and thermal properties of diamond remove the need for cryogenics, complex laser setups, and vacuum systems, enabling compact, low‑power quantum devices that can be deployed in standard environments.

    Quantum sensing to quantum computing
    - NV centers are already enabling ultra‑sensitive sensing, from nanoscale MRI and quantum microscopy to magnetometry for GPS‑free navigation and neurotech applications using diamond chips under growing brain cells.
    - Mark and Brian frame sensing not as a hedge but as a volume driver that builds the diamond supply chain, pushes costs down, and lays the manufacturing groundwork for future quantum computing chips.

    Fabrication, scalability, and the value chain
    - A key theme is the shift from early “shotgun” vacancy placement in diamond to a semiconductor‑style, wafer‑like process with high‑purity material, lithography, characterization, and yield engineering.
    - Brian characterizes Quantum Brilliance’s strategy as “lab to fab”: deciding where to sit in the value chain, leveraging the existing semiconductor ecosystem, and building a partner network rather than owning everything from chips to compilers.

    Devices, roadmaps, and hybrid nodes
    - Quantum Brilliance has deployed room‑temperature systems with a handful of physical qubits at Oak Ridge National Laboratory, Fraunhofer IAF, and the Pawsey Supercomputing Centre.
    - Their roadmap targets application‑specific quantum computing with useful qubit counts toward the end of this decade, and lunchbox‑scale, fault‑tolerant systems with on the order of 50–60 logical qubits in the mid‑2030s.

    Modality tradeoffs and business discipline
    - Mark positions diamond NV qubits as mid‑range in both speed and coherence time compared with superconducting and trapped‑ion systems, with their differentiator being compute density, energy efficiency, and ease of deployment rather than raw gate speed.
    - Brian brings four decades of experience in semiconductors, batteries, lidar, and optical networking to emphasize milestones, early revenue from sensing, and usability—arguing that making quantum devices easy to integrate and operate is as important as the underlying physics for attracting partners, customers, and investors.

    Partners and ecosystem
    - The episode underscores how collaborations with institutions such as Oak Ridge, Fraunhofer, and Pawsey, along with industrial and defense partners, help refine real‑world requirements and ensure the technology solves concrete problems rather than just hitting abstract benchmarks.
    - By co‑designing with end users and complementary hardware and software vendors, Quantum Brilliance aims to “democratize” access to quantum devices, moving them from specialized cryogenic labs to desks, edge systems, and embedded platforms.

    37 min
  8. 11/26/2025

    Macroscopic Quantum Tunneling with Nobel Laureate John Martinis

    Episode overviewJohn Martinis, Nobel laureate and former head of Google’s quantum hardware effort, joins Sebastian Hassinger on The New Quantum Era to trace the arc of superconducting quantum circuits—from the first demonstrations of macroscopic quantum tunneling in the 1980s to today’s push for wafer-scale, manufacturable qubit processors. The episode weaves together the physics of “synthetic atoms” built from Josephson junctions, the engineering mindset needed to turn them into reliable computers, and what it will take for fabrication to unlock true large-scale quantum systems. Guest bioJohn M. Martinis is a physicist whose experiments on superconducting circuits with John Clarke and Michel Devoret at UC Berkeley established that a macroscopic electrical circuit can exhibit quantum tunneling and discrete energy levels, work recognized by the 2025 Nobel Prize in Physics “for the discovery of macroscopic quantum mechanical tunnelling and energy quantisation in an electric circuit.” He went on to lead the superconducting quantum computing effort at Google, where his team demonstrated large-scale, programmable transmon-based processors, and now heads Qolab (also referred to in the episode as CoLab), a startup focused on advanced fabrication and wafer-scale integration of superconducting qubits. Martinis’s career sits at the intersection of precision instrumentation and systems engineering, drawing on a scientific “family tree” that runs from Cambridge through John Clarke’s group at Berkeley, with strong theoretical influence from Michel Devoret and deep exposure to ion-trap work by Dave Wineland and Chris Monroe at NIST. Today his work emphasizes solving the hardest fabrication and wiring challenges—pursuing high-yield, monolithic, wafer-scale quantum processors that can ultimately host tens of thousands of reproducible qubits on a single 300 mm wafer. 
Key topics

- Macroscopic quantum tunneling on a chip: How Clarke, Devoret, and Martinis used a current-biased Josephson junction to show that a macroscopic circuit variable obeys quantum mechanics, with microwave control revealing discrete energy levels and tunneling between states, laying the groundwork for superconducting qubits. The episode connects this early work directly to the Nobel committee’s citation and to today’s use of Josephson circuits as “synthetic atoms” for quantum computing.
- From DC devices to microwave qubits: Why early Josephson devices were treated as low-frequency, DC elements, and how failed experiments pushed Martinis and collaborators to re-engineer their setups with careful microwave filtering, impedance control, and dilution refrigerators, turning noisy circuits into clean, quantized systems suitable for qubits. This shift to microwave control and readout becomes the through-line from macroscopic tunneling experiments to modern transmon qubits and multi-qubit gates.
- Synthetic atoms vs natural atoms: The contrast between macroscopic “synthetic atoms” built from capacitors, inductors, and Josephson junctions and natural atomic systems used in ion-trap and neutral-atom experiments by groups such as Wineland and Monroe at NIST, where single-atom control made the quantum nature more obvious. The conversation highlights how both approaches converged on single-particle control, but with very different technological paths and community cultures.
- Ten-year learning curve for devices: How roughly a decade of experiments on quantum noise, energy levels, and escape rates in superconducting devices built confidence that these circuits were “clean enough” to support serious qubit experiments, just as early demonstrations such as Yasunobu Nakamura’s single-Cooper-pair box showed clear two-level behavior. This foundational work set the stage for the modern era of superconducting quantum computing across academia and industry.
- Surface code and systems thinking: Why Martinis immersed himself in the surface code, co-authoring a widely cited tutorial-style paper, “Surface codes: Towards practical large-scale quantum computation” (Austin G. Fowler, Matteo Mariantoni, John M. Martinis, Andrew N. Cleland, Phys. Rev. A 86, 032324, 2012; arXiv:1208.0928), to translate error-correction theory into something experimentalists could build. He describes this as a turning point that reframed his work at UC Santa Barbara and Google around full-system design rather than isolated device physics.
- Fabrication as the new frontier: Martinis argues that the physics of decent transmon-style qubits is now well understood and that the real bottleneck is industrial-grade fabrication and wiring, not inventing ever more qubit variants. His company’s roadmap targets wafer-scale integration, e.g., ~100-qubit test chips scaling toward ~20,000 qubits on a 300 mm wafer, with a focus on yield, junction reproducibility, and integrated escape wiring rather than current approaches that tile many 100-qubit dies into larger systems.
- From lab racks of cables to true integrated circuits: The episode contrasts today’s dilution-refrigerator setups, dominated by bulky wiring and discrete microwave components, with the vision of a highly integrated superconducting “IC” where most of that wiring is brought on-chip. Martinis likens the current state to pre-IC TTL logic full of hand-wired boards and sees monolithic quantum chips as the necessary analog of CMOS integration for classical computing.
- Venture timelines vs physics timelines: A candid discussion of the mismatch between typical three-to-five-year venture capital expectations and the multi-decade arc of foundational technologies like CMOS and, now, quantum computing. Martinis suggests that the most transformative work, such as radically improved junction fabrication, looks slow and uncompetitive in the short term but can yield step-change advantages once it matures.
- Physics vs systems-engineering mindsets: How Martinis’s “instrumentation family tree” and exposure to both the American “build first, then understand” and French “analyze first, then build” traditions shaped his approach, and how systems engineering often pushes him to challenge ideas that don’t scale. He frames this dual mindset as both a superpower and a source of tension when working in large organizations used to more incremental science-driven projects.
- Collaboration, competition, and pre-competitive science: Reflections on the early years when groups at Berkeley, Saclay, UCSB, NIST, and elsewhere shared results openly, pushing the field forward without cut-throat scooping, before activity moved into more corporate settings around 2010. Martinis emphasizes that many of the hardest scaling problems, especially in materials and fabrication, would benefit from deeper cross-organization collaboration, even as current business constraints limit what can be shared.

Papers and research discussed

- “Energy-Level Quantization in the Zero-Voltage State of a Current-Biased Josephson Junction” – John M. Martinis, Michel H. Devoret, John Clarke, Physical Review Letters 55, 1543 (1985). First clear observation of quantized energy levels and macroscopic quantum tunneling in a Josephson circuit, forming a core part of the work recognized by the 2025 Nobel Prize in Physics. Link: https://link.aps.org/doi/10.1103/PhysRevLett.55.1543
- “Quantum Mechanics of a Macroscopic Variable: The Phase Difference of a Josephson Junction” – J. Clarke et al., Science 239, 992 (1988). Further development of macroscopic quantum tunneling and wave-packet dynamics in current-biased Josephson junctions, demonstrating that a circuit-scale degree of freedom behaves as a quantum variable. Link (PDF via Cleland group):
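For readers who want the physics behind the macroscopic-tunneling topic above, the standard textbook picture (not worked through in the episode) treats the junction’s phase difference δ as a particle in a tilted “washboard” potential; the symbols E_J (Josephson energy), I_c (critical current), and C (junction capacitance) are conventional, not quoted from the conversation:

```latex
% Phase particle in the tilted washboard potential of a
% current-biased Josephson junction (bias current I):
U(\delta) = -E_J\!\left(\cos\delta + \frac{I}{I_c}\,\delta\right),
\qquad E_J = \frac{\hbar I_c}{2e}.

% Barrier height for I close to I_c:
\Delta U \approx \frac{4\sqrt{2}}{3}\,E_J\!\left(1 - \frac{I}{I_c}\right)^{3/2}.

% Small-oscillation (plasma) frequency at the bottom of a well:
\omega_p = \sqrt{\frac{2 e I_c}{\hbar C}}
\left[1 - \left(\frac{I}{I_c}\right)^{2}\right]^{1/4}.

% WKB escape rate for macroscopic quantum tunneling,
% up to a prefactor of order unity:
\Gamma \;\sim\; \frac{\omega_p}{2\pi}\,
\exp\!\left(-\,7.2\,\frac{\Delta U}{\hbar\,\omega_p}\right).
```

Measuring escape rates like Γ versus bias current, and seeing resonant enhancement when microwaves match the level spacing near ħω_p, is how the 1985 experiment distinguished quantum tunneling and quantized levels from thermal activation.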

    49 min
4.5 out of 5 (42 Ratings)

