Tuesday, May 31, 2022

Peak Digital

Random topic for your end-of-year brain break: I predict that we will hit #PeakDigital within a decade or so. Beyond that, we will have the #SecondQuantumRevolution and the #AnalogRenaissance. The idea of (classical) binary-only data processing will come to seem...quaint.

We will learn to create, process, store, transmit and correct analog data in myriad forms. Same with digital and continuous-variable quantum. All at fidelity and reliability that far exceed vinyl records, film, wax cylinders, differential analyzers and tide calculation machines, allowing essentially infinite rounds of processing and transmission with no detectable degradation.

It will be far faster and more efficient than digital computation, and after some teething pains will come to be seen as flexible and reliable.

People will talk of a quartet of information paradigms -- classical and quantum; digital/discrete and analog/continuous.

Information theory and computational complexity will merge completely.

And the kicker? P ?= NP turns out to be irrelevant: complexity proves to be a continuum rather than discrete classes, a topography of hills and valleys, and whether you roll downhill toward polynomial or exponential comes down to a microscopic difference on a ridge.

Now, do you think I actually believe this will work out exactly this way? I'll hedge my bets and keep my cards face down. But this might provoke some conversation. What do *YOU* think information will look like in, say, 2075?

https://twitter.com/rdviii/status/1476071997239336961


Saturday, May 28, 2022

From Classical to Quantum Optics Course on YouTube


The Japanese government is funding a group of us, led by Prof. Kae Nemoto (now at Okinawa Institute of Science and Technology), to create an undergraduate quantum computing (engineering?) curriculum. We are calling it Quantum Academy. As of this writing, getting the full experience requires registering for a class at one of the participating institutions, but we are making our share of the work available as Creative Commons, CC-BY-SA. Each module is intended to be one unit that will satisfy MEXT, about ten to twelve hours of face time with faculty, plus reading & homework. (Most Japanese university courses are two units, a total of 40-50 hours of work, about half the size of a U.S. course.)

Our contribution this year is the module From Classical to Quantum Optics. Actually, technically, that link will take you to the playlists on our YouTube channel, which include not only that module but also last year's Overview of Quantum Communications. Moreover, there are both English and Japanese versions available! Michal Hajdušek created the materials and the English videos, while I did the Japanese videos.

My apologies, but we are still working on subtitles, especially for Japanese. If anyone would like to do subtitles in another language, please let us know; we are very interested!

Eventually, edited transcripts will be available in book form, as well. Patience, please!

The module begins with the classical wave equation, Fourier analysis, and Maxwell's equations, then gets into single photons. It follows on nicely from Overview of Quantum Communications, where we talked mostly about qubits as abstract things, but also covered the critical technology of waveguides such as optical fibers, without fully justifying why light can be guided that way. Here, Maxwell's equations and the wave equation shore up that foundation (see the quick recap after the outline). Here's the outline:

From Classical to Quantum Light

Prerequisites: Overview of Quantum Communications, Linear Algebra, Probability

Prerequisites/co-requisites: Differential Equations, Introductory Partial Differential Equations, Introductory Quantum Mechanics, Classical Optics

Recommended next courses: Quantum Internet (coming in 2023)
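
As a quick taste of where the module starts (a recap of my own for this post, not an excerpt from the course materials): in vacuum, Maxwell's equations imply a wave equation for the electric field, which is the doorway to everything that follows.

\nabla \cdot \mathbf{E} = 0, \qquad \nabla \cdot \mathbf{B} = 0, \qquad
\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad
\nabla \times \mathbf{B} = \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}

Taking the curl of the third equation, substituting the fourth, and using \nabla \cdot \mathbf{E} = 0 gives

\nabla^2 \mathbf{E} = \mu_0 \varepsilon_0 \frac{\partial^2 \mathbf{E}}{\partial t^2},
\qquad c = \frac{1}{\sqrt{\mu_0 \varepsilon_0}}.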


An Early, Relatively Complete Quantum Computer Design

 Based on a Twitter thread by yours truly, about research with Thaddeus Ladd and Austin Fowler back in 2009.

In the context of an ongoing #QuantumArchitecture #QuantumComputing project, today we reviewed some of my old work. I believe that in 2009 this was the most complete architectural study in the world.

We started with a technology (optically controlled quantum dots in silicon nanophotonics), an error correction scheme (the then-new surface code) and a workload/goal (factoring a 2048-bit number).

We considered everything.

Optimized implementation of Peter Shor's algorithm (at least the arithmetic, the expensive part). (More recent work by Archimedes Pavlidis and by Craig Gidney goes beyond where we were in 2009.)
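
As a reminder of why the arithmetic dominates (this sketch is mine, in Python, and is the plain classical analogue rather than our optimized quantum circuit): for an n-bit modulus, the modular exponentiation decomposes into roughly 2n conditional modular multiplications, one per exponent bit.

def modexp_like_shor(a: int, x: int, N: int) -> int:
    """Classical analogue of the modular exponentiation at the heart of Shor's
    algorithm: for each bit i of the exponent, conditionally multiply by
    a^(2^i) mod N.  In the quantum circuit each conditional multiply becomes a
    controlled modular multiplier, and with ~2n exponent qubits there are ~2n
    of them -- which is why the arithmetic dominates the cost."""
    result = 1
    power = a % N                          # a^(2^0) mod N
    for i in range(x.bit_length()):
        if (x >> i) & 1:
            result = (result * power) % N  # "controlled" multiplication
        power = (power * power) % N        # advance to a^(2^(i+1)) mod N
    return result

# Example: 7 has order 4 modulo 15, so 7^12 mod 15 == 1
assert modexp_like_shor(7, 12, 15) == pow(7, 12, 15) == 1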

How many logical qubits do we need? 6n = 12K.

How many logical Toffoli gates? A LOT.

So how low a residual logical gate error can we allow?

Given that, and a proposed physical gate error rate, how much distillation do we need? How much should we allow for "wiring", channels to move the logical qubits around?

We ended up with 65% of the space on the surface code lattice for distillation, 25% for wiring, and only 10% for the actual data.
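
Here is that chain of estimates as a little back-of-the-envelope Python sketch. The 6n logical-qubit count and the 65/25/10 split are from above; the Toffoli count and the target failure probability are placeholders I picked for illustration, not the paper's numbers.

# Back-of-the-envelope budget for the chain of estimates above.  The 6n
# logical-qubit rule and the 65/25/10 area split are from the text; the
# Toffoli count and the target failure probability are illustrative
# placeholders, NOT the numbers from the paper.

n = 2048                           # bits in the number to be factored
logical_data_qubits = 6 * n        # 6n = 12,288 logical qubits for the arithmetic

TOFFOLI_COUNT = 4e11               # placeholder order of magnitude ("A LOT")
target_failure = 0.1               # acceptable chance the whole run fails

# Union bound: to keep the whole computation's failure probability at or below
# target_failure, each logical gate may fail with probability at most about
# target_failure / TOFFOLI_COUNT.
max_logical_gate_error = target_failure / TOFFOLI_COUNT
print(f"residual logical gate error must be below ~{max_logical_gate_error:.1e}")

# Area budget on the surface code lattice (fractions from the text above).
fractions = {"distillation": 0.65, "wiring": 0.25, "data": 0.10}
lattice_slots = logical_data_qubits / fractions["data"]
for purpose, frac in fractions.items():
    print(f"{purpose:>12}: {frac * lattice_slots:,.0f} logical-qubit slots")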

From here, we estimated the code distance needed. High, but not outrageous, in our opinion. (More on that below.)

With micron-sized dots and waveguides, we had already grokked that a multi-chip system was necessary, so we knew we were looking at (at least) a two-level system for the surface code lattice, with some stabilizers measured fast and others slow.

We worked through various designs, and wound up with one that uses several different types of connections between neighbors. See the labels on the right ("W connection", "C connection", etc.) in this figure from the paper. This is the system design.



It turns out each type has a different connection rate and a different fidelity, so we need purification on the Bell pairs created between ancillary dots before we can use them for the CNOT gates in stabilizer measurements. This could mean that portions of the lattice run faster and portions run slower.
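
For intuition on what purification buys you, here is the textbook single-round recurrence for purifying Werner-state Bell pairs, as a Python sketch of my own (a generic BBPSSW-style round, not the specific protocol we analyzed in the paper).

def purify_once(f: float) -> tuple[float, float]:
    """One purification round on two Werner-state Bell pairs of fidelity f.

    Returns (output fidelity, success probability).  On success, two noisy
    pairs are consumed to produce one better pair, which is how links of
    different quality can be brought up to a common fidelity before the
    Bell pairs are used for the stabilizer-measurement CNOTs."""
    e = (1.0 - f) / 3.0                           # weight of each error term
    p_success = f * f + 2.0 * f * e + 5.0 * e * e
    f_out = (f * f + e * e) / p_success
    return f_out, p_success

f = 0.90                                          # illustrative starting fidelity
for round_number in range(1, 4):
    f, p = purify_once(f)
    print(f"round {round_number}: fidelity {f:.4f}, success probability {p:.2f}")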

Oh, and an advance in this architecture that I think is under-appreciated: the microarchitecture is designed to work around nonfunctional dots and maintain a complete lattice. I have slides on that; sadly, they didn't make it into the paper, but see Sec. 2.1.

Put it all together, and it's an enormous system. How big?

Six billion qubits.

So if you have heard somewhere that it takes billions of qubits to factor a large number, this might be where it started. But that was always a number with a lot of conditions on it. You really can't just toss out a number; you have to understand the system top to bottom.

Your own system will certainly vary.

The value in this paper is in the techniques used to design and analyze the system.

You have to lay out the system in all its tedious detail, like in this table that summarizes the design.



That lattice constant in the table, btw, is one edge of a surface code hole, so the actual code distance is 4x that, or d = 56. We were aiming for just a factor of three below the surface code threshold, and for a huge computation. These days, most people will tell you hardware needs to be 10x better than threshold for QEC to be feasible. If not, you end up with numbers like these.
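
To see why a small margin below threshold drives the distance up so hard, here is the usual rough scaling heuristic as a Python sketch (my illustration, not the detailed error model from the paper): the logical error rate per round falls off roughly as (p/p_th)^((d+1)/2), so at a third of threshold each increment of d buys far less than it does at a tenth.

def logical_error_rate(p_over_threshold: float, d: int, prefactor: float = 0.1) -> float:
    """Heuristic per-round logical error rate at code distance d.

    Both the prefactor and the functional form are the usual rough
    approximation, not a fitted model."""
    return prefactor * p_over_threshold ** ((d + 1) // 2)

target = 1e-13   # illustrative per-gate logical error target for a huge computation

for p_over_threshold in (1 / 3, 1 / 10):
    d = 3
    while logical_error_rate(p_over_threshold, d) > target:
        d += 2
    print(f"p/p_th = {p_over_threshold:.2f}: need distance d ~ {d}")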

You can pretty much guess who did what. Austin Fowler (then in Waterloo, now at Google) did the QEC analysis. Thaddeus Ladd (then at Stanford with Yoshi Yamamoto, now at HRL) did the quantum dot analysis and simulation. I brought the arithmetic expertise and overall system view. We all worked super hard on the chip layout and how all the pieces fit together to make a system.

These things are beasts, with so many facets to the design problem that we need all hands to make them work!


Quoted/Cited in the Press

A couple of places where the press has quoted me or cited my research recently, on Quantum Internet stuff. (This blog posting is more for my own reference than anything else. Feel free to ignore it.)


  • Physics World on the Hanson group's new paper, showing two-hop teleportation after entanglement swapping in NV diamond.
  • Nikkei talking about Quantum Internet and about our work, but not directly quoting me. (In Japanese, and behind a paywall.)