Beyond the Qubit

Frank Dekker

The #1 Quantum Technology podcast for investors.

  1. 1D AGO

    What if games are not just a way to explain quantum, but a way to build real quantum intuition?

    What if games are not just a way to explain quantum, but a way to build real quantum intuition? That may sound playful, but the idea is serious.

    One reason DeepMind changed the direction of AI is that it treated games as more than entertainment. They became environments for learning, experimentation, search, and discovery.

    That matters for quantum computing too. Because in quantum, the challenge is not only building better hardware. It is also learning how to navigate an enormous space of possible quantum circuits, quantum algorithms, and interactions. Most of that space is noise. Useful structure is rare. And intuition is hard to build.

    That is where games become interesting. Games create rules, feedback, and goals. They give people a more structured way to explore complexity. And if a quantum problem can be turned into something game-like, it may become easier for humans to experiment, easier for creative thinkers to engage, and potentially more accessible to AI methods that have already proven powerful in game environments.

    That is why this conversation stood out to me. Maybe games can do for quantum computing what they once did for AI: not solve everything, but create the interface that helps people discover what matters.

    Part 2 with Evert van Nieuwenburg is out now on Beyond the Qubit.

    Do you think games could become a real tool for quantum research and quantum algorithm discovery, or will they remain mostly educational?

    #QuantumComputing #QuantumAlgorithms #QuantumResearch #AI #Gaming #DeepTech #BeyondTheQubit

    41 min
  2. MAR 6

    Can we model the real device physics before we commit to the lab?

    Here is the question that decides whether quantum scales. Can we model the real device physics before we commit to the lab?

    Quantum is not blocked by qubits alone. It is blocked by the missing quantum EDA stack.

    I just recorded a 30-minute summary with Jonathon Riddell, CEO of Kothar Computing, and his message is concrete. A big barrier to building quantum computers today is, in his words, strangely still on the classical computing side. Because the classical EDA toolchain was never built to capture the quantum physics that determines whether these devices work.

    So teams simulate, hand designs to a fab, fabricate the chip, measure it, and only then realize key effects were not captured before committing to the lab. Jonathon calls this being blind, and he adds an important nuance. That is a bit harsh, because we do capture some effects. But not the full physics we actually care about.

    That is why quantum EDA matters. He describes it as a top-to-bottom simulation suite that lets engineers design quantum chips while capturing the physics that matters, before expensive lab cycles.

    Where Kothar fits in the value chain is as the compute engine underneath that stack. They built a language, Aleph, combining symbolic computing with numerical high-performance computing.

    The idea is simple. In quantum many-body problems, symbolic reduction is leverage. Skip it, and you can end up solving something one hundred to one thousand times harder than necessary.

    My investor takeaway. If you only track qubit counts, you are missing a parallel race.

    The 30-minute summary of the deep-dive interview with CEO Jonathon Riddell is out now.
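To make the symbolic-reduction point concrete, here is a minimal sketch of my own (plain Python, nothing to do with Kothar's Aleph): the same polynomial evaluated in its naive expanded form versus its symbolically reduced form. The reduced form does a fraction of the arithmetic per evaluation, which is the leverage being described.

```python
from math import comb, isclose

def p_expanded(x):
    # The "unreduced" problem: p(x) = (x + 1)**20 written out as 21
    # binomial terms, so every evaluation pays for every term and power.
    return sum(comb(20, k) * x**k for k in range(21))

def p_reduced(x):
    # The symbolically reduced form: one addition, one exponentiation.
    return (x + 1) ** 20

# Both forms agree numerically, but the reduced one is far cheaper,
# which matters when the expression is evaluated millions of times.
assert isclose(p_expanded(0.5), p_reduced(0.5))
```

In a real many-body calculation the expressions are vastly larger, so the gap between the naive and reduced forms is what produces the "one hundred to one thousand times harder" factor mentioned above.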
Link: https://youtu.be/ZAc-1nnNLa0

    #quantum #eda #quantumcomputing #semiconductors #chipdesign #hpc #scientificcomputing #deeptech #simulation #designautomation #QuantumSoftware #HybridComputing #DeveloperTools #ProgrammingLanguages #Compilers #Compilation #Reproducibility #PerformanceEngineering #SoftwareEngineering #ComputeAcceleration #KotharComputing #physics #BeyondTheQubit #FutureOfCompute @Kotharcomputing @JonathonRiddell

    📌 Disclaimer: This post is shared on a personal basis and I do not represent any company

    28 min
  3. FEB 27

    The next Synopsys and Cadence might be built for quantum.

    Most people think quantum progress is a qubit story. Jonathon Riddell argues a big part of the bottleneck is the missing quantum EDA stack.

    In part 2 of my deep dive with Jonathon, CEO of Kothar Computing, the punchline is blunt. We are still building quantum hardware at the scientific-experiment layer, because classical EDA was never built to capture the quantum physics that determines whether these devices work.

    So teams simulate, hand designs to a fab, fabricate the chip, measure it… and only then realize key effects weren’t captured before committing to the lab. Jonathon calls this being “blind,” and he also notes that’s a bit harsh because we do capture some effects. But the point stands.

    This is not just an innovation-pace problem. It is a design-loop problem.

    Here’s my takeaway analogy. Classical semiconductors became an industry when EDA helped turn physics into repeatable engineering workflows. Quantum will need its own version of that.

    Jonathon’s vision is a vertically integrated quantum EDA workflow, from design choices all the way to a file you hand off to a fab, without stitching together disjoint tools. He thinks the field is heading there within about five years.

    Kothar’s wedge is the compute engine underneath that stack. They built a language, Aleph, combining symbolic computing with numerical high-performance computing. In quantum many-body problems, symbolic reduction matters: skip it, and you can end up solving something 100 to 1,000 times harder than necessary.

    That is also why the partnership with Nanoacademic Technologies’ QTCAD is telling. QTCAD builds realistic bottom-up chip models. Kothar focuses on the hard quantum many-body solving layer once the models get real. Modeling plus solving. That is how an EDA ecosystem forms.

    One investor line that stuck with me. The value can be compressing a 12-month project into a week by removing orchestration friction and making workflows robust enough to scale.

    If you are underwriting quantum, track qubit counts.
    But also ask the more boring, more powerful question. Who is building the EDA layer that makes quantum engineering repeatable?

    Because the winners may not be the team with the most qubits first. They may be the team with the fastest design loop.

    #quantum #eda #quantumcomputing #semiconductors #chipdesign #hpc #scientificcomputing #deeptech #simulation #designautomation

    📌 Disclaimer: This post is shared on a personal basis and I do not represent any company

    49 min
  4. FEB 20

    Python is not the problem. The compile gap is.

    Most people talk about quantum as if the hard part is the qubits. In my interview with Jonathon Riddell, CEO of Kothar Computing, the bottleneck looked different: the classical layer that has to run the science.

    Because real quantum workflows are hybrid. Quantum plus classical. And hybrid workflows live or die on orchestration, reproducibility, performance, and deployment.

    Here is the uncomfortable truth. Python is perfect for exploration. It is the front door. Python has great compiler-adjacent tools, but the workflow is still fragmented and hard to make robust end-to-end. The pain starts when you move from notebooks to real workloads and you need predictable execution, repeatable builds, and optimized, validated runs across heterogeneous hardware.

    That is the compile gap. The jump from Python-first workflows to a reliable compilation and transpilation pipeline that targets CPUs, GPUs, and QPUs. And it shows up everywhere you care about in physics and quantum: dynamic languages make certain classes of errors harder to catch early, and large scientific stacks accumulate risk through runtime shape/type/unit mismatches. As systems grow, you want more errors caught before runtime, and failures that are loud and actionable.

    This is why I find Aleph so interesting. Aleph is Kothar’s attempt to raise the ceiling for scientific and quantum computing: a language designed to feel natural for researchers, while still being built for compilation and performance.

    The idea is simple but powerful. Keep the ergonomics scientists love. Add the compiler backbone production systems require. Make hybrid workflows feel normal, not fragile.

    If you are building or investing in quantum, I think this framing matters. The winners will not just have better qubits. They will have better tooling that turns quantum into a usable accelerator inside a larger scientific workflow.

    Part 1 of the deep dive is out now.
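Here is a small sketch of the kind of unit mismatch the post describes, plus one lightweight mitigation. This is my own illustration in plain Python, not Kothar or Aleph code: a dynamic language happily computes a plausible-looking but wrong number, unless you make the unit part of the value so misuse fails loudly.

```python
from dataclasses import dataclass

def kinetic_energy(mass_kg, velocity):
    # Classic silent failure: nothing checks that velocity is in m/s.
    # A caller passing km/h gets a plausible but wrong number.
    return 0.5 * mass_kg * velocity**2

@dataclass(frozen=True)
class MetersPerSecond:
    # Wrapping the unit into a type makes the intent explicit.
    value: float

def kinetic_energy_checked(mass_kg: float, velocity: MetersPerSecond) -> float:
    # The mismatch now fails loudly at the call site instead of
    # propagating a wrong number through the rest of the pipeline.
    if not isinstance(velocity, MetersPerSecond):
        raise TypeError("velocity must be MetersPerSecond")
    return 0.5 * mass_kg * velocity.value**2

# Unchecked: silently accepts any number, whatever its unit.
assert kinetic_energy(2.0, 3.0) == 9.0
# Checked: same result, but a bare float would raise TypeError.
assert kinetic_energy_checked(2.0, MetersPerSecond(3.0)) == 9.0
```

A compiled, typed language can reject the bad call before the program ever runs, which is the "errors caught before runtime" property the post is pointing at.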
Link: https://youtu.be/T_idIcdYSgc

    Also curious: where do you feel the pain most today, compilation, debugging, or reproducibility?

    #QuantumComputing #QuantumSoftware #HybridComputing #ScientificComputing #HPC #DeveloperTools #ProgrammingLanguages #Compilers #Compilation #Reproducibility #PerformanceEngineering #SoftwareEngineering #ComputeAcceleration #DeepTech #KotharComputing #physics #BeyondTheQubit #FutureOfCompute @Kotharcomputing @JonathonRiddell

    📌 Disclaimer: This post is shared on a personal basis and I do not represent any company

    50 min
  5. FEB 13

    If we hit 100 logical qubits, the conversation around quantum changes fast.

    Because it moves the field from impressive lab demos to workloads you can actually run.

    After a year of hosting Beyond the Qubit, I have learned this. The real challenge is not the physics. It is knowing what is real progress, while the answer is still uncertain.

    Here is what I have learned so far.

    First. Quantum is no longer one story. There are multiple credible technology paths, and it is genuinely difficult today to say which one will win.

    Second. Scaling is still underestimated. Not just more qubits on a chip, but also clustering chips together, and improving error correction so you need fewer physical qubits for each logical qubit.

    Third. A simple truth I keep repeating to myself. A logical qubit is an error-corrected qubit you can compute with reliably. And today, the world still has none, or only a very small number of them.

    That is why the next milestone matters. My working heuristic is this. Around 100 logical qubits is where the first meaningful applications may start to appear. And somewhere around 1,000 to 2,000 logical qubits is where many of the big applications start to open up, like molecular modelling and large-scale optimization. The exact number will depend on the application and the error rates. But the order of magnitude matters.

    So if the industry reaches 100 logical qubits, it is not just a benchmark. It is a strong signal that scaling is working. And it makes the path toward 1,000-plus feel less like science fiction and more like an engineering roadmap.

    That shift changes things. Capital. Talent. Time horizons. And the way society talks about quantum.

    Now a second lesson for investors and builders. This market could consolidate around a few winners. That is exciting, but it also means technology risk remains high.

    So where do you look if you want exposure without betting on a single horse? Suppliers. The enablers.
    Because scaling does not just mean better chips. It depends on the technology; for superconducting qubits it means more channels, more calibration, more test, more wiring, more cooling, and better error-correction tooling.

    This is why I like studying the enabling layer. Chip testing, control systems, interconnects, cryogenics, and error-correction software. These companies often aim to support more than one quantum technology path, which can mean earlier revenue and lower single-technology risk.

    Two examples I personally find interesting are Orange Quantum Systems and QC Design. Not investment advice, just examples of the enabling layer.

    One more observation. The world is spending enormous amounts on compute for AI. Quantum is not the same as AI compute, and AI spend is not a direct driver of quantum. But it can accelerate adjacent tooling, packaging, photonics, and engineering talent that the quantum ecosystem also depends on. Add geopolitics and digital sovereignty, and quantum becomes even more strategic.

    So yes. Quantum still has uncertainty. But the direction of travel is clear. The next years are about proving that logical qubits can scale. Through scale-up, scale-out, and better error correction. That is what I will keep tracking on Beyond the Qubit.

    Now I am curious about your view. Which unlock do you think comes first on the road to 100 logical qubits? Scale-up, scale-out, or error correction?

    If you want to receive the presentation, post "presentation" below in the comments.

    Here are the links for:

    Youtube:

    Spotify:

    📌 Disclaimer: This post is shared on a personal basis and I do not represent any company

    #QuantumComputing #QuantumTechnology #FaultTolerantQuantumComputing #LogicalQubits #QuantumErrorCorrection #QuantumHardware #DeepTechInvesting #Semiconductors
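To make the physical-versus-logical gap above concrete, here is a back-of-the-envelope sketch using the textbook surface-code scaling of roughly 2·d² physical qubits per logical qubit at code distance d. The distance d = 11 is my own illustrative assumption, not a figure from the episode; real overheads depend heavily on hardware error rates and the code used.

```python
def physical_qubits(logical_qubits: int, distance: int) -> int:
    # Textbook surface-code estimate: ~2 * d**2 physical qubits
    # per logical qubit at code distance d (illustrative only).
    return logical_qubits * 2 * distance**2

# The two milestones from the post, at an assumed distance d = 11:
print(physical_qubits(100, 11))    # 100 logical   -> 24,200 physical
print(physical_qubits(1000, 11))   # 1,000 logical -> 242,000 physical
```

The point of the arithmetic is the investor point: every improvement in error correction that lets you shrink d (or the constant in front of it) cuts the physical-qubit bill quadratically, which is why the enabling layer matters so much.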

    11 min
  6. JAN 30

    Error correction isn’t primarily blocked by physics anymore

    “Error correction isn’t primarily blocked by physics anymore. It’s blocked by design choices.” That was one of the strongest realizations from Part 2 of my deep dive with Ish Dhand, co-founder of QC Design, on Beyond the Qubit.

    Most people talk about fault-tolerant quantum computing as if it’s a single problem. In reality, it’s a design-space explosion. That reframed how I think about progress in quantum.

    What stood out to me in this part of the conversation:

    • Hardware teams don’t struggle with one error; they struggle with many interacting imperfections at the same time
    • Open-source simulators can scale to thousands of qubits, but usually only by assuming very simplified error models
    • Real hardware has to deal with leakage, coherent errors, pulse timing, idling, and cross-talk, all at once
    • Many of these effects only become visible at the scale of thousands of physical qubits per logical qubit

    This is where QC Design plays a unique role. Rather than betting on a single error-correction code or architecture, they help hardware teams simulate realistic fault-tolerant systems before building them, across platforms, codes, decoders, and noise models.

    What really changed my perspective: error correction isn’t just about finding a better code. It’s about understanding where engineering effort actually pays off.

    If leakage hurts your logical qubits more than erasures, why spend years optimizing the wrong thing? If longer pulses improve gate fidelity but quietly destroy system performance through idling errors, where’s the real optimum?

    These aren’t academic questions. They determine cost, timelines, and whether scaling is even feasible.

    One line from Ish really stuck with me: today, the cost of a truly useful fault-tolerant quantum computer is effectively infinite. The real progress is making that number finite, and then bringing it down.

    That single sentence reframes the entire industry.
    In this episode, we go deep into:

    • why decoding speed matters as much as code efficiency
    • why “software will fix it later” is usually the wrong mindset
    • why logical fidelity matters more than raw qubit counts
    • and why fault tolerance is becoming a full-stack engineering problem

    If you care about how quantum computers will actually be built, not just announced, this conversation is worth your time.

    🎙️ Beyond the Qubit — Part 2 with Ish Dhand
    🔗 https://youtu.be/ugo3g1Mws2M

    #FaultTolerantQuantum #QuantumArchitecture #ErrorCorrection #QuantumSoftware #BeyondTheQubit

    ⁨@IshDhand⁩ ⁨@QC_Design⁩

    📌 Disclaimer: This post is shared on a personal basis and I do not represent any company

    49 min

About

The #1 Quantum Technology podcast for investors.