The New Quantum Era - innovation in quantum computing, science and technology

Sebastian Hassinger

Details

Your host, Sebastian Hassinger, interviews brilliant research scientists, software developers, engineers and others actively exploring the possibilities of our new quantum era. We will cover topics in quantum computing, networking and sensing, focusing on hardware, algorithms and general theory. The show aims for accessibility - Sebastian is not a physicist - and we'll try to provide context for the terminology and glimpses at the fascinating history of this new field as it evolves in real time.

Recent Episodes

Regional quantum development with Alejandra Y. Castillo
JAN 19, 2026
Alejandra Y. Castillo, former Assistant Secretary of Commerce for Economic Development and now Chancellor Senior Fellow for Economic Development at Purdue University Northwest, joins your host, Sebastian Hassinger, to discuss how quantum technologies can drive inclusive regional economic growth and workforce development. She shares lessons from federal policy, Midwest tech hubs, and cross-state coalitions working to turn quantum from lab research into broad-based opportunity.

Themes and key insights
Quantum as near-term and multi-faceted: Castillo pushes back on the idea that quantum is distant, emphasizing that computing, sensing, and communications are already maturing and attracting serious investment from traditional industries like biopharma.
From federal de-risking to regional ecosystems: She describes the federal role as de-risking early innovation through programs under the CHIPS and Science Act, while stressing that long-term success depends on regional coalitions across states, universities, industry, philanthropy, and local government.
Inclusive workforce and supply-chain planning: Castillo argues that the "quantum workforce" must go beyond PhDs to include a mapped ecosystem of jobs, skills, suppliers, housing, and infrastructure, so that local communities see quantum as opportunity, not displacement.
National security, urgency, and inclusion: She frames sustained quantum investment as both an economic and a national security imperative, warning that inconsistent U.S. funding risks falling behind foreign competitors, while also noting that private capital alone may ignore inclusion and regional equity.

Notable quotes
"We either focus on the urgency or we're going to have to focus on the emergency."
"No one state is going to do this… This is a regional play that we will be called to answer for the sake of a national security play as well."
"We want to make sure that entire regions can actually reposition themselves from an economic perspective, so that people can stay in the places they call home—now we're talking about quantum."
"Are we going to make that same mistake again, or should we start to think about and plan how quantum is going to also impact us?"

Articles, papers, and initiatives mentioned
America's quantum future depends on regional ecosystems like Chicago's — Alejandra's editorial in Crain's Chicago Business calling for sustained, coordinated investment in quantum as a national security and economic priority, highlighting the role of the Midwest and tech hubs.
CHIPS and Science Act (formerly "Endless Frontier") — U.S. legislation that authorized large-scale funding for semiconductors and science, enabling EDA's Tech Hubs and NSF's Engines programs to back regional coalitions in emerging technologies like quantum.
EDA Tech Hubs and NSF Engines programs — Federal initiatives that fund multi-state consortiums combining universities, companies, and civic organizations to build durable regional innovation ecosystems, including quantum-focused hubs in the Midwest.
National Quantum Algorithms Center — This center explores quantum algorithms for real-world problems such as natural disasters and biopharma discovery, aiming to connect quantum advances directly to societal challenges.
Roberts Impact Lab at Purdue Northwest (with Quantum Corridor) — A testbed and workforce development center focused on quantum, AI, and post-quantum cryptography, designed to prepare local talent and companies for quantum-era applications.
Chicago Quantum Exchange and regional partners (Illinois, Indiana, Wisconsin) — A multi-university and multi-state collaboration that pioneered a model for regional quantum ecosystems.
32 MIN
Majorana qubits with Chetan Nayak
JAN 12, 2026
In this episode of The New Quantum Era, your host Sebastian Hassinger is joined by Chetan Nayak, Technical Fellow at Microsoft, professor of physics at the University of California Santa Barbara, and driving force behind Microsoft's quantum hardware R&D program. They discuss a qubit modality that has not been covered on the podcast before, based on Majorana fermionic behavior, which promises topological protection against the errors that are such a challenge to quantum computing.

Guest Bio
Chetan Nayak is a Technical Fellow at Microsoft and leads the company's topological quantum hardware program, including the Majorana 1 processor based on Majorana-zero-mode qubits. He is also a professor of physics at UCSB and a leading theorist in topological phases of matter, non-Abelian anyons, and topological quantum computation. Chetan co-founded Microsoft's Station Q in 2005, building a bridge from theoretical proposals for topological qubits to engineered semiconductor–superconductor devices.

What we talk about
Chetan's first exposure to quantum computing in Peter Shor's lectures at the Institute for Advanced Study, and how that intersected with his PhD work with Frank Wilczek on non-Abelian topological phases and Majorana zero modes.
The early days of topological quantum computation: fractional quantum Hall states at filling ν = 5/2, emergent quasiparticles, and the realization that braiding these excitations naturally implements Clifford gates.
How Alexei Kitaev's toric-code and Majorana-chain ideas connected abstract topology to concrete condensed-matter systems, and led to Chetan's collaboration with Michael Freedman and Sankar Das Sarma.
The 2005 proposal for a gallium-arsenide quantum Hall device realizing a topological qubit, and the founding of Station Q to turn such theoretical blueprints into experimental devices in partnership with academic labs.
Why Microsoft pivoted from quantum Hall platforms to semiconductor–superconductor nanowires: leveraging the Fu–Kane proximity effect, spin–orbit-coupled semiconductors, and a huge material design space—while wrestling with the challenges of interfaces and integration.
The evolution of the tetron architecture: two parallel topological nanowires with four Majorana zero modes, connected by a trivial superconducting wire and coupled to quantum dots that enable native Z- and X-parity loop measurements (a brief sketch of this encoding follows the paper list below).
How topological superconductivity allows a superconducting island to host even or odd total electron parity without a local signature, and why that nonlocal encoding provides hardware-level protection for the qubit's logical 0 and 1.
Microsoft's roadmap in a 2D "quality vs. complexity" space: improving topological gap, readout signal-to-noise, and measurement fidelity while scaling from single tetrons to error-corrected logical qubits and, ultimately, utility-scale systems.
Error correction on top of topological qubits: using surface codes and Hastings–Haah Floquet codes with native two-qubit parity measurements, and targeting hundreds of physical tetrons per logical qubit and thousands of logical qubits for applications like Shor's algorithm and quantum chemistry.
Engineering for scale: digital, on-off control of quantum-dot couplings; cryogenic CMOS to fan out control lines inside the fridge; and why tetron size and microsecond-scale operations sit in a sweet spot for both physics and classical feedback.
Where things stand today: the Majorana 1 chiplet, recent tetron loop-measurement experiments, DARPA's US2QC program, and how external users—starting with government and academic partners—will begin to access these devices before broader Azure Quantum integration.

Papers and resources mentioned
These are representative papers and resources that align with topics and allusions in the conversation; they are good entry points if you want to go deeper.
Non-Abelian Anyons and Topological Quantum Computation – C. Nayak, S. H. Simon, A. Stern, M. Freedman, S. Das Sarma, Rev. Mod. Phys. 80, 1083 (2008).
Topological quantum computation (early device proposals) – Sankar Das Sarma, Michael Freedman, and Chetan Nayak, Physics Today 59(7), 32–38 (July 2006).
Roadmap to fault-tolerant quantum computation using topological qubits – C. Nayak et al., arXiv:2502.12252.
Distinct lifetimes for X and Z loop measurements in a Majorana tetron – C. Nayak et al., arXiv:2507.08795.
Majorana qubit codes that also correct odd-weight errors – S. Kundu and B. Reichardt, arXiv:2311.01779.
Microsoft's Majorana 1 chip carves new path for quantum computing – Microsoft blog post.
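For readers who want the bookkeeping behind the tetron discussion, here is the standard Majorana-pair encoding in textbook form; this is general background, not a formula quoted from the episode.

```latex
% Standard Majorana-tetron encoding (general background, not quoted from
% the episode). Four Majorana operators satisfy \gamma_i^\dagger = \gamma_i,
% \gamma_i^2 = 1, and \gamma_i\gamma_j = -\gamma_j\gamma_i for i \neq j.
\[
  P_{\mathrm{tot}} = (i\gamma_1\gamma_2)(i\gamma_3\gamma_4) = \pm 1
  \qquad \text{(total island parity, fixed below the superconducting gap)}
\]
\[
  \bar Z = i\gamma_1\gamma_2, \qquad \bar X = i\gamma_2\gamma_3, \qquad
  \bar Z\,\bar X = -\,\bar X\,\bar Z .
\]
% The logical state is stored in which pairwise parities are +1 or -1,
% a joint (nonlocal) property with no local signature; the quantum-dot
% "loop" measurements described above read out exactly these pairwise
% parity operators.
```

Because the logical Z and X are themselves two-Majorana parities, the native operations are measurements of these operators rather than continuous gate rotations, which is why measurement-based codes such as the surface and Floquet codes mentioned above are a natural fit.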
63 MIN
Peaked quantum circuits with Hrant Gharibyan
DEC 12, 2025
In this episode of The New Quantum Era, Sebastian talks with Hrant Gharibyan, CEO and co-founder of BlueQubit, about "peaked circuits" and the challenge of verifying quantum advantage. They unpack Scott Aaronson and Yuxuan Zhang's original peaked-circuit proposal, BlueQubit's scalable implementation on real hardware, and a new public challenge that invites the community to attack their construction using the best classical algorithms available. Along the way, they explore how this line of work connects to cryptography, hardness assumptions, and the near-term role of quantum devices as powerful scientific instruments.

Topics covered
Why verifying quantum advantage is hard: The core problem: if a quantum device claims to solve a task that is classically intractable, how can anyone check that it did the right thing? Random circuit sampling (as in Google's 2019 "supremacy" experiment and follow-on work from Google and Quantinuum) is believed to be classically hard to simulate, but the verification metrics (like cross-entropy benchmarking) are themselves classically intractable at scale.
What are peaked circuits? Aaronson and Zhang's idea: construct circuits that look like random circuits in every respect, but whose output distribution secretly has one special bit string with an anomalously high probability (the "peak"). The designer knows the secret bit string, so a quantum device can be verified by checking that measurement statistics visibly reveal the peak in a modest number of shots, while finding that same peak classically should be as hard as simulating a random circuit (a toy verification sketch follows these notes).
BlueQubit's scalable construction and hardware demo: BlueQubit extended the original 24-qubit, simulator-based peaked-circuit construction to much larger sizes using new classical protocols. Hrant explains their protocol for building peaked circuits on Quantinuum's H2 processor with around 56 qubits, thousands of gates, and effectively all-to-all connectivity, while still hiding a single secret bit string that appears as a clear peak when run on the device.
Obfuscation tricks and "quantum steganography": The team uses multiple obfuscation layers (including "swap" and "sweeping" tricks) to transform simple peaked circuits into ones that are statistically indistinguishable from generic random circuits, yet still preserve the hidden peak.
The BlueQubit Quantum Advantage Challenge: To stress-test their hardness assumptions, BlueQubit has published concrete circuits and launched a public bounty (currently a quarter of a bitcoin) for anyone who can recover the secret bit string classically. The aim is to catalyze work on better classical simulation and de-quantization techniques; either someone closes the gap (forcing the protocol to evolve) or the standing bounty helps establish public trust that the task really is classically infeasible.
Potential cryptographic angles: Although the main focus is verification of quantum advantage, Hrant outlines how the construction has a cryptographic flavor: a secret bit string effectively acts as a key, and only a sufficiently powerful quantum device can efficiently "decrypt" it by revealing the peak. Variants of the protocol could, in principle, yield schemes that are classically secure but only decryptable by quantum hardware, and even quantum-plus-key secure, though this remains speculative and secondary to the verification use case.
From verification protocol to startup roadmap: Hrant positions BlueQubit as an algorithm and capability company: deeply hardware-aware, but focused on building and analyzing advantage-style algorithms tailored to specific devices. The peaked-circuit work is one pillar in a broader effort that includes near-term scientific applications in condensed-matter physics and materials (e.g., Fermi–Hubbard models and out-of-time-ordered correlators) where quantum devices can already probe regimes beyond leading classical methods.
Scientific advantage today, commercial advantage tomorrow: Sebastian and Hrant emphasize that the first durable quantum advantages are likely to appear in scientific computing—acting as exotic lab instruments for physicists, chemists, and materials scientists—well before mass-market "killer apps" arrive. Once robust, verifiable scientific advantage is established, scaling to larger models and more complex systems becomes a question of engineering, with clear lines of sight to industrial impact in sectors like pharmaceuticals, advanced materials, and manufacturing.

The challenge: https://app.bluequbit.io/hackathons/
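To make the verification step concrete, here is a minimal, hypothetical Python sketch. It is illustrative only: it is not BlueQubit's protocol or code, and the function and the toy data are invented for this example. Given samples from a device, it checks whether the most frequent outcome matches the designer's planted bit string and stands well above the roughly 2^-n weight any single string carries in a generic, peak-free random circuit.

```python
# Illustrative sketch only: not BlueQubit's protocol or code.
# verify_peak and the toy data below are invented for this example.
import random
from collections import Counter

def verify_peak(samples, secret_bitstring, n_qubits, margin=10.0):
    """Return True if the most common outcome equals the planted secret and
    its empirical frequency is well above the ~2**-n weight that any single
    string would carry in a generic (peak-free) random circuit."""
    counts = Counter(samples)
    top_string, top_count = counts.most_common(1)[0]
    top_freq = top_count / len(samples)
    background = 2.0 ** (-n_qubits)  # typical single-string probability
    return top_string == secret_bitstring and top_freq > margin * background

# Toy usage: 1,000 synthetic shots from a hypothetical 8-qubit peaked
# circuit whose designer planted the string "10110010" with ~5% weight.
random.seed(0)
secret = "10110010"
shots = [
    secret if random.random() < 0.05
    else "".join(random.choice("01") for _ in range(8))
    for _ in range(1000)
]
print(verify_peak(shots, secret, n_qubits=8))  # expected output: True
```

The real constructions discussed in the episode operate at dozens of qubits and thousands of gates, where the quantum side of this check needs only a modest number of shots while finding the peak classically (without the secret) is conjectured to be as hard as simulating a random circuit.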
29 MIN
Diamond vacancies and scalable qubits with Quantum Brilliance
DEC 6, 2025
Episode overview
This episode of The New Quantum Era features a conversation with Quantum Brilliance co-founder and CEO Mark Luo and independent board chair Brian Wong about diamond nitrogen vacancy (NV) centers as a platform for both quantum computing and quantum sensing. The discussion covers how NV centers work, what makes diamond-based qubits attractive at room temperature, and how to turn a lab technology into a scalable product and business.

What are diamond NV qubits?
Mark explains how nitrogen vacancy centers in synthetic diamond act as stable room-temperature qubits, with a nitrogen atom adjacent to a missing carbon atom creating a spin system that can be initialized and read out optically or electronically. The rigidity and thermal properties of diamond remove the need for cryogenics, complex laser setups, and vacuum systems, enabling compact, low-power quantum devices that can be deployed in standard environments. (A standard form of the NV ground-state Hamiltonian is sketched after these notes.)

Quantum sensing to quantum computing
NV centers are already enabling ultra-sensitive sensing, from nanoscale MRI and quantum microscopy to magnetometry for GPS-free navigation and neurotech applications using diamond chips under growing brain cells. Mark and Brian frame sensing not as a hedge but as a volume driver that builds the diamond supply chain, pushes costs down, and lays the manufacturing groundwork for future quantum computing chips.

Fabrication, scalability, and the value chain
A key theme is the shift from early "shotgun" vacancy placement in diamond to a semiconductor-style, wafer-like process with high-purity material, lithography, characterization, and yield engineering. Brian characterizes Quantum Brilliance's strategy as "lab to fab": deciding where to sit in the value chain, leveraging the existing semiconductor ecosystem, and building a partner network rather than owning everything from chips to compilers.

Devices, roadmaps, and hybrid nodes
Quantum Brilliance has deployed room-temperature systems with a handful of physical qubits at Oak Ridge National Laboratory, Fraunhofer IAF, and the Pawsey Supercomputing Centre. Their roadmap targets application-specific quantum computing with useful qubit counts toward the end of this decade, and lunchbox-scale, fault-tolerant systems with on the order of 50–60 logical qubits in the mid-2030s.

Modality tradeoffs and business discipline
Mark positions diamond NV qubits as mid-range in both speed and coherence time compared with superconducting and trapped-ion systems, with their differentiator being compute density, energy efficiency, and ease of deployment rather than raw gate speed. Brian brings four decades of experience in semiconductors, batteries, lidar, and optical networking to emphasize milestones, early revenue from sensing, and usability—arguing that making quantum devices easy to integrate and operate is as important as the underlying physics for attracting partners, customers, and investors.

Partners and ecosystem
The episode underscores how collaborations with institutions such as Oak Ridge, Fraunhofer, and Pawsey, along with industrial and defense partners, help refine real-world requirements and ensure the technology solves concrete problems rather than just hitting abstract benchmarks. By co-designing with end users and complementary hardware and software vendors, Quantum Brilliance aims to "democratize" access to quantum devices, moving them from specialized cryogenic labs to desks, edge systems, and embedded platforms.
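As background for the sensing and room-temperature claims above, the NV center's ground-state spin is usually summarized by the standard textbook Hamiltonian below; this is general background (strain and hyperfine terms omitted), not a formula from the episode.

```latex
% Standard NV ground-state spin Hamiltonian (background only; strain and
% hyperfine terms omitted). S is a spin-1 operator.
\[
  H/h \;\approx\; D\,S_z^{2} \;+\; \gamma_e\,\vec{B}\cdot\vec{S},
  \qquad D \approx 2.87~\mathrm{GHz}, \qquad
  \gamma_e \approx 28~\mathrm{GHz/T}.
\]
% The zero-field splitting D separates m_s = 0 from m_s = +/-1 even at room
% temperature; optical pumping initializes the spin into m_s = 0, and
% spin-dependent fluorescence reads it out. A magnetic field shifts the
% m_s = +/-1 resonances by roughly +/- \gamma_e B_z, which is the signal
% NV magnetometers measure.
```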
36 MIN
Macroscopic Quantum Tunneling with Nobel Laureate John Martinis
NOV 26, 2025
Episode overview
John Martinis, Nobel laureate and former head of Google's quantum hardware effort, joins Sebastian Hassinger on The New Quantum Era to trace the arc of superconducting quantum circuits—from the first demonstrations of macroscopic quantum tunneling in the 1980s to today's push for wafer-scale, manufacturable qubit processors. The episode weaves together the physics of "synthetic atoms" built from Josephson junctions, the engineering mindset needed to turn them into reliable computers, and what it will take for fabrication to unlock true large-scale quantum systems.

Guest bio
John M. Martinis is a physicist whose experiments on superconducting circuits with John Clarke and Michel Devoret at UC Berkeley established that a macroscopic electrical circuit can exhibit quantum tunneling and discrete energy levels, work recognized by the 2025 Nobel Prize in Physics "for the discovery of macroscopic quantum mechanical tunnelling and energy quantisation in an electric circuit." He went on to lead the superconducting quantum computing effort at Google, where his team demonstrated large-scale, programmable transmon-based processors, and now heads Qolab (also referred to in the episode as CoLab), a startup focused on advanced fabrication and wafer-scale integration of superconducting qubits.
Martinis's career sits at the intersection of precision instrumentation and systems engineering, drawing on a scientific "family tree" that runs from Cambridge through John Clarke's group at Berkeley, with strong theoretical influence from Michel Devoret and deep exposure to ion-trap work by Dave Wineland and Chris Monroe at NIST. Today his work emphasizes solving the hardest fabrication and wiring challenges—pursuing high-yield, monolithic, wafer-scale quantum processors that can ultimately host tens of thousands of reproducible qubits on a single 300 mm wafer.

Key topics
Macroscopic quantum tunneling on a chip: How Clarke, Devoret, and Martinis used a current-biased Josephson junction to show that a macroscopic circuit variable obeys quantum mechanics, with microwave control revealing discrete energy levels and tunneling between states—laying the groundwork for superconducting qubits. The episode connects this early work directly to the Nobel committee's citation and to today's use of Josephson circuits as "synthetic atoms" for quantum computing. (The standard tilted-washboard picture behind these experiments is sketched after the paper list below.)
From DC devices to microwave qubits: Why early Josephson devices were treated as low-frequency, DC elements, and how failed experiments pushed Martinis and collaborators to re-engineer their setups with careful microwave filtering, impedance control, and dilution refrigerators—turning noisy circuits into clean, quantized systems suitable for qubits. This shift to microwave control and readout becomes the through-line from macroscopic tunneling experiments to modern transmon qubits and multi-qubit gates.
Synthetic atoms vs. natural atoms: The contrast between macroscopic "synthetic atoms" built from capacitors, inductors, and Josephson junctions and natural atomic systems used in ion-trap and neutral-atom experiments by groups such as Wineland and Monroe at NIST, where single-atom control made the quantum nature more obvious. The conversation highlights how both approaches converged on single-particle control, but with very different technological paths and community cultures.
Ten-year learning curve for devices: How roughly a decade of experiments on quantum noise, energy levels, and escape rates in superconducting devices built confidence that these circuits were "clean enough" to support serious qubit experiments, just as early demonstrations such as Yasunobu Nakamura's single-Cooper-pair box showed clear two-level behavior. This foundational work set the stage for the modern era of superconducting quantum computing across academia and industry.
Surface code and systems thinking: Why Martinis immersed himself in the surface code, co-authoring a widely cited tutorial-style paper, "Surface codes: Towards practical large-scale quantum computation" (Austin G. Fowler, Matteo Mariantoni, John M. Martinis, Andrew N. Cleland, Phys. Rev. A 86, 032324, 2012; arXiv:1208.0928), to translate error-correction theory into something experimentalists could build. He describes this as a turning point that reframed his work at UC Santa Barbara and Google around full-system design rather than isolated device physics.
Fabrication as the new frontier: Martinis argues that the physics of decent transmon-style qubits is now well understood and that the real bottleneck is industrial-grade fabrication and wiring, not inventing ever more qubit variants. His company's roadmap targets wafer-scale integration—e.g., ~100-qubit test chips scaling toward ~20,000 qubits on a 300 mm wafer—with a focus on yield, junction reproducibility, and integrated escape wiring rather than current approaches that tile many 100-qubit dies into larger systems.
From lab racks of cables to true integrated circuits: The episode contrasts today's dilution-refrigerator setups—dominated by bulky wiring and discrete microwave components—with the vision of a highly integrated superconducting "IC" where most of that wiring is brought on-chip. Martinis likens the current state to pre-IC TTL logic full of hand-wired boards and sees monolithic quantum chips as the necessary analog of CMOS integration for classical computing.
Venture timelines vs. physics timelines: A candid discussion of the mismatch between typical three-to-five-year venture capital expectations and the multi-decade arc of foundational technologies like CMOS and, now, quantum computing. Martinis suggests that the most transformative work—such as radically improved junction fabrication—looks slow and uncompetitive in the short term but can yield step-change advantages once it matures.
Physics vs. systems-engineering mindsets: How Martinis's "instrumentation family tree" and exposure to both American "build first, then understand" and French "analyze first, then build" traditions shaped his approach, and how systems engineering often pushes him to challenge ideas that don't scale. He frames this dual mindset as both a superpower and a source of tension when working in large organizations used to more incremental, science-driven projects.
Collaboration, competition, and pre-competitive science: Reflections on the early years when groups at Berkeley, Saclay, UCSB, NIST, and elsewhere shared results openly, pushing the field forward without cut-throat scooping, before activity moved into more corporate settings around 2010. Martinis emphasizes that many of the hardest scaling problems—especially in materials and fabrication—would benefit from deeper cross-organization collaboration, even as current business constraints limit what can be shared.

Papers and research discussed
"Energy-Level Quantization in the Zero-Voltage State of a Current-Biased Josephson Junction" – John M. Martinis, Michel H. Devoret, John Clarke, Physical Review Letters 55, 1543 (1985). First clear observation of quantized energy levels and macroscopic quantum tunneling in a Josephson circuit, forming a core part of the work recognized by the 2025 Nobel Prize in Physics. Link: https://link.aps.org/doi/10.1103/PhysRevLett.55.1543
"Quantum Mechanics of a Macroscopic Variable: The Phase Difference of a Josephson Junction" – J. Clarke et al., Science 239, 992 (1988). Further development of macroscopic quantum tunneling and wave-packet dynamics in current-biased Josephson junctions, demonstrating that a circuit-scale degree of freedom behaves as a quantum variable.
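As background for the macroscopic-quantum-tunneling discussion (standard textbook material rather than equations from the episode), a current-biased Josephson junction behaves like a particle in a tilted-washboard potential:

```latex
% Tilted-washboard potential for a current-biased Josephson junction
% (standard background, not quoted from the episode). \delta is the
% gauge-invariant phase difference, I_c the critical current, C the
% junction capacitance, and I the bias current.
\[
  U(\delta) \;=\; -E_J\!\left(\cos\delta + \frac{I}{I_c}\,\delta\right),
  \qquad E_J = \frac{\hbar I_c}{2e},
\]
\[
  \omega_p \;=\; \sqrt{\frac{2e I_c}{\hbar C}}
  \left[1 - \left(\frac{I}{I_c}\right)^{2}\right]^{1/4}.
\]
% For bias currents just below I_c the phase sits in a shallow metastable
% well with a few quantized levels spaced by roughly \hbar\omega_p;
% microwave spectroscopy of those levels and the enhanced escape
% (tunneling) rate out of the well are what the mid-1980s
% Martinis-Devoret-Clarke experiments observed.
```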
49 MIN