Tuesday, April 11, 2023

Twentieth Anniversary

Last month (March 2023) was the twentieth anniversary of me moving from classical into quantum computing "full time". I'll always remember exactly when this was, because I was working for Nokia at the time, and had already decided that I wanted to move from engineering back into research and in particular to pursue quantum computing. I was out on parental leave when our second daughter was born, and it was announced that the unit I was part of was being shut down; some people moved into other internal positions, others were let go. I had been trying to get Nokia Research to take me on and let me move to Tokyo to be a Ph.D. student at Keio, but we couldn't work it out, so I agreed to take the layoff buyout. We parted quite amicably; I am still close with a few of the people from that era.

So.

Between the Network Alchemy stock options from the Nokia acquisition, plus the buyout, plus some money we made from selling our house in Half Moon Bay, I had enough to pay for a Ph.D. and eventually put a down payment on our house here in Kamakura. So I didn't worry about money, initially (I should have been worrying much more about it for thirty years, but that's a different story).

We took care of our kids, sold the house, did a month-long cross-country road trip, did various things, and moved to Japan in early August 2003. I attended my first quantum computing conference, the fourth EQIS (now known as AQIS) at the end of August or early September before officially becoming a Ph.D. student in late September.

I came to Keio to be Kohei Itoh's Ph.D. student, but I was afraid I couldn't get into the applied physics Ph.D. program, so I applied in CS and wound up with Fumio Teraoka, a respected networking researcher, as my official advisor. (Tera-san once said to me, roughly, "You got Paul Mockapetris and Bob Hinden to write you recommendation letters?!? Geez...") Kohei was my unofficial advisor, and Jun Murai an occasional mentor and later unofficially on my Ph.D. committee. Kohei quickly introduced me to Kae Nemoto, who became an important collaborator and also joined my Ph.D. committee. Kohei is now president of Keio, making him, to the best of my knowledge, the first quantum computing person to run a major university anywhere in the world.

I did my thesis defense June 13, 2006, if I remember right, then the final exam a few weeks later and graduated that September, 36 months after starting. Not too bad.

It was, and remains, quite the adventure. I am glad I made this choice. Thanks to all who have supported me or chosen to run alongside me on this quixotic quest.

Thursday, March 16, 2023

RFC9340: the First Quantum RFC!

 Request for Comments (RFC) 9340: Architecture Principles for a Quantum Internet is now published! Thanks to Wojtek, who originated it, and Stephanie, Shota, Marcello, Angela Sara, and Bruno, who all made invaluable contributions to it.

RFCs are the technical documents that describe the Internet. They come in several "streams", or publishing tracks. This one is from the Internet Research Task Force (IRTF) stream. They may have one of several statuses, including "Standards Track", "Standard", or "Experimental". This one is "Informational", a common type for things that are ready to share and to be haggled over in a public forum, but don't yet directly impact the operation of broad swathes of the Internet itself.

Creation of this document took four years and almost a dozen formal drafts in a public forum, with lots of comments not only from the authors but from other members of the IETF and QIRG communities. Thanks, all!

Reducing Errors by...Using Errors

 Just a quick note: our newest paper, "Leveraging hardware-control imperfections for error mitigation via generalized quantum subspace", by Yasuhiro Ohkura, Suguru Endo, Takahiko Satoh, Rodney Van Meter, Nobuyuki Yoshioka, is up on the arXiv at https://arxiv.org/abs/2303.07660!

This paper goes into a lot of detail on error mitigation techniques. Basically, in quantum error mitigation, you add errors into your quantum circuit in order to understand the effect of errors on the answer to your problem. For example, you introduce some waiting time between finishing the computation and measuring the results, to see how decoherence affects the state. Or, you add a couple of gates that, in theory, cancel out, and see how the error rate changes. These kinds of techniques have the potential to extend the range of problems that can be solved with noisy quantum computers.
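The second trick above, inserting gate pairs that should cancel, is the noise-amplification step behind one of the simplest mitigation methods, zero-noise extrapolation. Here is a minimal sketch of that extrapolation step (my own toy example with made-up numbers, not the generalized-subspace method of the paper): measure your observable at several artificially increased noise levels, then extrapolate back to zero noise.

    import numpy as np

    # Hypothetical measured expectation values of some observable at noise
    # scale factors 1x, 2x, 3x (e.g., obtained by inserting cancelling gate
    # pairs or adding idle time before measurement). These numbers are made up.
    noise_scales = np.array([1.0, 2.0, 3.0])
    measured     = np.array([0.82, 0.69, 0.58])

    # Fit a low-degree polynomial and evaluate it at noise scale 0
    # (Richardson-style extrapolation).
    coeffs = np.polyfit(noise_scales, measured, deg=2)
    mitigated = np.polyval(coeffs, 0.0)
    print(f"zero-noise extrapolated value ~ {mitigated:.3f}")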

This paper, joint work with NTT and Todai/RIKEN/JST (one author has three affiliations), represents an amazing amount of learning by Yasuhiro (who goes by the nickname "rum"). The math in it is captured fairly succinctly, but in fact there is a lot more underlying it, and I am deeply impressed by rum's rate of learning over the last year. He has used an astounding amount of computation time on specific IBM processors during the creation of this paper, and I would say it has paid off. I think this paper will help guide others who have read a little about error mitigation, but aren't sure how to apply it in their own context. It's the kind of practical paper that bridges theory and implementation that I consider a signature of my group.

Thursday, February 09, 2023

Cloud Cover

 If you're looking for information on cloud cover, this website has you, er, covered:

https://www.earthenv.org/cloud

It claims 1 km resolution; you can certainly zoom way in. The data is available under a Creative Commons license, too.

A table of data for a few cities in Japan is available here.

Sunday, January 15, 2023

Some Recent Important Quantum Theory Results, Part One: MIP* = RE

 While preparing for class, and for my general edification, I have been trying to organize thoughts about recent important results in theory and algorithms. The latter I try to keep up with on general principles, but the former is largely above my station on the mountain, as I stand on the col looking up at the peak through the swirling clouds of my own ignorance. I'm an engineer, and proud of it, and so I have an engineer's understanding of complexity theory. In fact, I have the understanding of an engineer trained in the 1980s; the theorists have been busy since then, and have really changed the landscape! So, take all that I say here with a grain of salt...


MIP* = RE 

(announced in January 2020, with evaluation ongoing) 
Zhengfeng Ji, Anand Natarajan, Thomas Vidick, John Wright, Henry Yuen

I'm not part of the CS theory community, but I could hear the trumpets, kazoos and champagne corks from over here in the Engineering quad on our global, virtual campus. This paper is considered a big deal, assuming it's right, to which most people seem to have given provisional assent.

The introductory paragraph of the CACM version summarizes forty years of theory -- exactly what I need. The key advances: understanding the importance of randomness (in both the action of the algorithm, and in a willingness to accept a probabilistic answer, I believe), interactive proofs, and multiple interactive provers. The authors then go on to cite the addition of entanglement as the fourth major shift. I feel like if I understood this paragraph alone, I would be a far better researcher.

Backing up a bit, computational complexity classes...checking The Complexity Zoo, we see that there are currently 546 complexity classes acknowledged there. Back when I was an undergrad, I learned about 2.5 of them: P, NP, and "bigger than NP", so things have changed a bit. To understand the importance of this paper, we are going to have to understand IP, MIP, MIP*, and RE and their relationship to some of the other basics. Okay, the Complexity Zoo itself is pretty intimidating, so let's try the introductory Petting Zoo, which introduces sixteen important classes...none of which are IP, MIP, MIP*, or RE.  All right, let's roll up our sleeves and see what we know...

  • P is the class of problems solvable in polynomial time -- the simple ones.
  • NP does not mean "non-polynomial"; it means non-deterministic polynomial. If you have a Turing machine that solves problems with high combinatoric complexity, then the challenge is to pick good potential answers to try. If trying one solution and failing doesn't help a whole lot in eliminating other possible solutions, you might be in NP territory. Typically, if the size of the problem is n, then the number of possible solutions grows exponentially, something like O(2^n). But the formal definition of NP is such that the Turing machine scratches its head, shrugs its shoulders, and somehow magically picks the right answer each time, in polynomial time. The practical implication is that an exponential number of potential answers may have to be checked one by one, but once you have found an answer it is easy to verify that the answer is correct.
  • MA is Merlin-Arthur, and is the first time we run into two parties in our theoretical conception and proofs. Merlin has infinite computational power, and wants to prove to Arthur that he knows the solution to a problem, so Merlin sends Arthur a proof. (This proof must be no more than polynomial in size, relative to the problem size n.)  Arthur has normal, mortal computational power, but he also has access to a source of randomness (coin flips), which he uses after receiving the proof from Merlin. Arthur checks the proof, using his coin flips as needed, but there is a probability he will wrongly decide whether or not the proof is correct. Generally, we require the probability of correctly accepting a true proof to be at least 2/3, and the probability of incorrectly accepting a wrong proof to be at most 1/3. (That is, the probability of false positive or false negative is each no more than 1/3.) This is the introduction of randomness discussed above. Note that, in this conception, only one round of communication is used.
    Merlin can be called the prover (P), and Arthur the verifier (V). Those terms come up a lot later.
    (Theorists were actively working on this around the time I was an undergrad, but I was completely oblivious to it. Babai 1985 is a key reference.)
  • AM is Arthur-Merlin, which is similar except that Arthur makes his coin tosses before Merlin produces the proof, and sends the coin toss results to Merlin, who can use them as he wishes. After Merlin sends the proof to Arthur, Arthur is required to deterministically check the proof.
  • IP is Interactive Proof, also introduced in 1985 by Goldwasser, Micali and Rackoff. It's similar to AM, except that there can be a polynomial number of rounds. This is where the interaction mentioned above comes in.
  • MIP is Multi-prover Interactive Proofs, introduced by Goldwasser & co. in 1988; it uses two (or more) provers. The provers are allowed to decide on a shared strategy or otherwise share information before the proof starts, but not after it begins. Perhaps the easiest way to enforce that non-communication is to verify the position of the two provers, and require their answers to be received by the verifier in time so short that the speed of light prevents them from having communicated with each other. Now we have our multiple provers.
  • RE is Recursively Enumerable: the set of all problems for which a Turing machine will halt and answer "yes" in finite (but not necessarily polynomial, or even bounded in advance) time whenever the answer is yes, though it may run forever when the answer is no. It includes undecidable problems such as the halting problem, so it's a very big set indeed.
  • ...and for the purposes of this blog entry, there things stood for a long time...until:
  • MIP*: MIP, except that the provers get to share entanglement beforehand. MIP* includes things like the CHSH game (a minimal numerical sketch appears right after this list).

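Since the CHSH game is the one piece of this I can actually compute, here is a toy numerical sketch of it (my own back-of-the-envelope calculation, nothing to do with the MIP* = RE proof itself). The verifier sends random bits x and y to the two provers; they answer a and b without communicating, and they win if a XOR b equals x AND y. With shared entanglement they can win about 85.4% of the time, versus 75% at best classically.

    import numpy as np

    def measurement(angle):
        """Projectors for measuring a qubit in the basis rotated by `angle`."""
        v0 = np.array([np.cos(angle), np.sin(angle)])
        v1 = np.array([-np.sin(angle), np.cos(angle)])
        return [np.outer(v0, v0), np.outer(v1, v1)]

    phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)       # shared state (|00> + |11>)/sqrt(2)
    alice_angles = {0: 0.0, 1: np.pi / 4}                 # Alice's basis for question x
    bob_angles   = {0: np.pi / 8, 1: -np.pi / 8}          # Bob's basis for question y

    win = 0.0
    for x in (0, 1):
        for y in (0, 1):
            for a, Pa in enumerate(measurement(alice_angles[x])):
                for b, Pb in enumerate(measurement(bob_angles[y])):
                    p = phi_plus @ np.kron(Pa, Pb) @ phi_plus   # joint outcome probability
                    if (a ^ b) == (x & y):
                        win += 0.25 * p                         # questions x, y are uniform

    print(f"entangled winning probability ~ {win:.4f}")   # ~0.8536 = cos^2(pi/8); classical best is 0.75

That gap between 0.75 and cos^2(pi/8) is the canonical example of what entangled provers can do that classical ones cannot.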
I'd like to have good pedagogical example problems to go with each of those classes, but at the moment, I don't. My apologies; I'll try to add some later. One important note is that from MA on down, a lot of classes are often expressed in terms of cryptographic functions or cryptographic games.

Also, roughly, that list goes from less complex to more complex, with each class including all of the problems in the classes above it on the list, but in general not all of the relationships are exactly known. Most famously, NP is commonly believed to include problems that are outside of P, but that has never been proven.

And so, the proof of MIP* = RE...drum roll, please...

I don't understand it.

But I'm not alone; most people don't. It's a complicated thing.

In fact, the link above is to the CACM version of the paper, which is written for readers like me, but the full paper is 237 pages. Not very many people have even read the whole thing.

But, I think I am safe in saying this: Very, very, very roughly, it is now (provisionally) known that polynomial amounts of entanglement can be used to solve some very enormous problems. What "polynomial" means here, and what the constant factors are, and real-world examples of problems that are classically intractable but quantumly tractable in practice, all remain as future work. Henry Yuen even refers to this as an "elephant" they are trying to get a feel for.

I wish I could do a better job of explaining its importance, though.  Instead, I will defer to one of the authors, Henry Yuen, and to Scott Aaronson. Scott in particular links to a lot of other explanations, so you can go from there.


Book Progress: Quantum Communications

Over the Christmas & New Year's break, a lot of editing on the book got done -- thousands of edits, according to git. 14/15 of the chapters are in serious local review. Chapter 15 still needs quite a bit of editing. Boilerplate, exercises, index, etc. are partially done; no cover yet. Glossary and table of notation aren't done yet.

If you want to be an alpha reviewer, let me know! It will be distributed soon.


Thursday, December 15, 2022

Quantum Internet class for undergrads!

It's now official -- next fall I will be teaching an undergraduate class titled "Quantum Internet". As far as I am aware, this is the world's first Quantum Internet course aimed at undergraduates!

Next year, we will have two undergraduate courses:
  •  Quantum Information Processing: intended to be a moderate introduction, suitable not only for those planning to continue in quantum but also for those who simply want to understand the key ideas. Intended to be accessible to ambitious freshmen, it requires only a little bit of linear algebra and discrete probability as math. This course is based on two short online courses we have created, plus some hands-on exercises. Taught flipped classroom style.
    • "Understanding Quantum Computers" (UQC) -- basic ideas of computation (superposition, interference, entanglement, unitary evolution, measurement, decoherence and no-cloning; algorithms; types of hardware), not a lot of math
    • "Overview of Quantum Communications" (English and Japanese) (OQC) -- a little more math and the basic ideas of quantum communications (teleportation, BB84, purification, entanglement swapping) as well as critical basic technology (especially waveguides/optical fibers and lasers)
    • exercises using IBM machines via GUI & Qiskit
  • Quantum Internet: intended for those wanting to go a little deeper, but not limited to those joining my research group, AQUA. The QIP course above is a strict prerequisite, and this course will also be taught flipped classroom style. There will be substantially more math in this, but it's not purely abstract; most of it is in service of designing real systems. It will be based on our 2nd and 3rd Q-Leap Quantum Academy modules:
    • "From Classical to Quantum Light" (CQL) (English and Japanese) -- Maxwell's equations, single photons, more on waveguides
    • "Quantum Internet" (QI) -- the module we are recording lessons for right now: deeper analysis of errors and error handling, network engineering issues such as routing, multiplexing, security.
    • Exercises are still a little TBD, but will include QuISP and Intel-based exercises
UQC was funded internally by Keio. OQC, CQL and QI are funded by the Japanese government Q-Leap program. Portions of OQC are now being translated into Chinese, Thai, Arabic, Korean and French, thanks to a grant from Intel. More announcements on languages to come!
The material from OQC is now being compiled into a book. We expect an alpha-release draft to be available in January 2023 and a near-final version in 2Q2023. We have not selected a publisher for the book yet, but are looking for one.
All of the materials are or will be available under a Creative Commons license. People are encouraged to reuse, remix, translate, etc. -- just give us credit and make your own work available, too.

Monday, November 21, 2022

Book Progress: Quantum Communications

 Seven of the fifteen chapters in our next book, "Quantum Communications", are ready for local students to use/review. With this momentum, the book should be 2/3 ready in a week. Aiming for 4/5 finished by Dec. 19, then completion of alpha release quality over New Year's.

Sometime in Q1 2023, I expect to open up a PDF for wide review and the git repository for pull requests for corrections or contributions. With luck, the completed book, at least at strong beta reader level, will be on the arXiv April-ish.
The book will be Creative Commons, CC-BY-SA. People will be encouraged to contribute to our version, recompile or restructure for their own local use, or translate into other languages.
Keio and the Japanese government pay me to create knowledge, but it does no good if it's not shared, and I want it to go much farther than just the students of Keio and the other elite Japanese universities. Intel is also providing us with support for education. Let's work toward a high-quality, online, flexible, supported, inclusive, highly available quantum curriculum with global reach, including language.

Sunday, November 06, 2022

Spelunking CACM, Vol. 12 (1969)

The first half of the year seems to move along smoothly, with a lot of algorithm papers but not much that is exciting. Automated printed circuit routing with a stepping aperture, by Stanley E. Lass, is the first hardware CAD/routing paper I've seen in CACM, but it cites a couple of things from the early 1960s. I remember VLSI engineers at Caltech in the early 1980s worrying that chips being designed by computers automatically were too complex to verify by hand. Of course, we now think of verification as being a task best suited for automation.

One letter to the editor in June, though, concerns an LA chapter meeting that bothered the author:

My disappointment stems from the subject matter and from the speakers. Three of the last meetings which I attended had to do with social conscience and community aspects. Two of these were on the same subject "Operation Bootstrap," which is an L.A. area self-help program for black people.

If only we had been more successful all the way back in 1969 in creating an inclusive atmosphere, and providing mentoring and help to those who needed it!

 A July paper talks about adding features to Dave Farber's SNOBOL. I haven't programmed in SNOBOL, so there are things I definitely don't grok, but there is a feature called "indirection", in which a variable contains the name of another variable, and via indirection the value of that variable can be read or written. This, of course, requires that the symbol table be available at run time, since the variable name can be constructed by the program! No indication in this paper as to what happens when the name isn't found in the symbol table (or, in modern terms, a dictionary?).
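For what it's worth, here is roughly what that indirection looks like in modern terms, with a Python dict standing in for the run-time symbol table (my own sketch, obviously not SLIP or SNOBOL):

    symbols = {"count": 42}          # the "symbol table", available at run time

    name = "co" + "unt"              # the variable *name* is constructed by the program
    value = symbols[name]            # indirect read  -> 42
    symbols[name] = value + 1        # indirect write -> count is now 43

    # If the name isn't in the table, Python raises a KeyError unless we ask politely;
    # the 1969 paper doesn't say what SNOBOL did in that case.
    missing = symbols.get("nonexistent")    # -> None instead of an error
    print(value, symbols["count"], missing)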

The September issue has a solid theory paper by David L. Parnas, On simulating networks of parallel processes in which simultaneous events may occur. It's dealing with discrete event simulation, and provides formal logic solutions for handling simultaneous events in the simulation of digital logic circuits. This work is roughly contemporaneous with the cooperating sequential processes work of Dijkstra and a little before the communicating sequential processes work of Hoare. Although this paper explicitly discusses the parallel nature of the work to be done and some machines that do the work in parallel, the paper really focuses on how to guarantee an acceptable sequential ordering.

And, in fact, in the following month's issue there is an article by Tony Hoare on An axiomatic basis for computer programming. This particular paper takes an...optimistic?...view of programming:

Computer programming is an exact science in that all the properties of a program and all the consequences of executing it in any given environment can, in principle, be found out from the text of the program itself by means of purely deductive reasoning. 

And then later,

When the correctness of a program, its compiler, and the hardware of the computer have all been established with mathematical certainty, it will be possible to place great reliance on the results of the program, and predict their properties with a confidence limited only by the reliability of the electronics.

Ah, would that it were so...


Friday, August 26, 2022

Quantum Ethics, the Conversation Continues

In the magazine Foreign Policy, Vivek Wadhwa and Mauritz Kop have an article titled, "Why Quantum Computing is Even More Dangerous than Artificial Intelligence", published in August 2022. It's one in a series of recent articles by Vivek arguing that technology needs to be regulated to address ethical concerns. In the article, Vivek and Mauritz argue that the unregulated, market-driven, ad hoc use of technology such as AI has produced outcomes that we would all consider undesirable, chiefly through mechanisms such as AI adopting our own biases or misuse of indiscriminately collected personal data. They tie quantum to these same issues, which I think is correct. In fact, I argued in a March blog posting that the ethical issues in quantum are, in the short run, very similar to classical computing, though I consider it too early to connect quantum to AI.

The authors have referenced some valuable online resources, such as the video "Quantum Ethics | A Call to Action", which includes excerpts from John Martinis, Nick Farina, Faye Wattleton, Ilyas Khan, Ilana Wisby, and Abe Asfaw; I highly recommend it.

I would not have picked the adjective "dangerous", but the conversation is critical. We should not be afraid of our technological future, but we should not use it without paying attention to the macro-level changes it brings. I believe quantum technology, the Second Quantum Revolution, will bring tremendous benefits in the long run -- if we use it wisely.

Monday, July 25, 2022

Spelunking CACM, vol. 11 (1968): Operating Systems Bonanza and Ethics, but not the ARPAnet!

If 1967 was a little dry, 1968 more than made up for it. The May issue is worth reading end-to-end! As it happens, that issue contains retypeset (and possibly updated?) versions of papers from the first Symposium on Operating Systems Principles (SOSP), which took place in 1967.  That first SOSP includes Larry Roberts's description of the ARPAnet, though that paper doesn't seem to have made it into CACM. (Some of the links below may be open access via the second link above, if the links below take you to the paywall.)

The year was just chock-full of operating systems papers by names such as Jack Dennis, Butler Lampson, and Edsger Dijkstra, and a comprehensive view of systems by E.L. Harder and another view of things by Jack Dennis, many but not all of them from that SOSP. This year also provided the first proposed ethics guidelines for ACM members, to the best of my recollection.

Donn B. Parker's article, "Rules of Ethics in Information Processing," presents the ACM's Guidelines for Professional Conduct in Information Processing. The article begins with three major points we still debate today, and a fourth (fraudulent programming schools) which seems to have been more transient. It begins:

There are a number of serious ethical problems in the arts and sciences of information processing. A few of these are: invasion of privacy by use of the computer; implications of copyrighting computer programs; and fraudulent programming trade schools. The "little" problems of personal ethics are of equal importance and are often closely related to the more imposing problems.

The original version filled less than a page, divided into rules for relations with the public, with employers and clients, and with other professionals. I can't help but notice the consistent use of the pronoun "he" to describe an ACM member, though honestly that probably would have been little different even a decade ago.  This set of guidelines has now grown into ACM's Code of Ethics, which everyone in the field should read.

As a Caltech grad, I couldn't help noticing this paper on a means of recovering an on-disk structure after a system crash. Of course it's a systems paper, so I'm naturally interested, but otherwise maybe not so remarkable by today's standards. It's the first non-Knuth-sensei Caltech paper I remember spotting, but I haven't checked the affiliation of every single author. (Knuth-sensei's first paper with a Caltech affiliation might be 1961.) I don't recognize either of the authors, Peter C. Lockemann and W. Dale Knutsen. Caltech punches far above its weight in almost every scientific field, and many Caltech alumni have made major contributions to computing, but computing seems to be relatively weaker than fields like physics and chemistry.

The April issue has a breathtaking, simultaneously contemporary and prescient view of how the need for computers was broadening. "Perhaps the computer is too big for the space vehicle, or to put in your vest pocket," Harder posited. It serves as a harbinger of the papers to come in May.

Daley and Dennis described virtual memory in MULTICS, one of the three or four most influential systems in history. Oppenheimer and Weizer presented their resource management in mid-sized systems. Van Horn of GE described a set of formal criteria for reproducibility in debugging. Interestingly, he used the term "virtual computer", essentially describing a process as an instance of a virtual machine, language very similar to what I use today. And Dennis shows up again, with a position paper arguing in favor of timesharing machines over networks that is well worth a read.

But the star, the paper for the ages, is Dijkstra's "The Structure of the THE Multiprogramming System", which features the observation that "an interrupt system to fall in love with is certainly an inspiring feature." Part of what makes the paper spectacular is that Dijkstra describes their failures and mistakes. We could all learn from this approach! Although perhaps his claim that the delivered system would be perfect should be taken with a grain of salt? The paper doesn't have references(!), but this is a real-world implementation of Dijkstra's Cooperating Sequential Processes, the idea he was developing in the mid- to late-1960s that would become one of the foundations of theoretical work on computing systems. The clever language also makes the paper a pleasure to read. All systems people should read this paper!


Sunday, July 10, 2022

Spelunking CACM, vol. 10 (1967)

In 1967, not a lot caught my eye, but here are the few items that did.
Parallelizable arithmetic expressions shows how to take a linear expression and convert it into a tree for maximum parallelizability in a single pass, assuming you have multiple ALUs. (A tiny illustration of the idea appears after these summaries.)
4-D hypercube includes some nice b&w wire frame drawings of a rotating hypercube done for a stereo animation. The author noted that he didn't feel that it gave him better insight into actual 4-D geometry.
Simulating a computer system is helpful in designing new systems. This simulation of an S/360 seems pretty sophisticated, including a careful job generator.
QED editor by Peter Deutsch and Butler Lampson, at Berkeley at the time, is a line editor for a teletype terminal. It mentions a few similar programs at other institutions, as well. This one includes search functionality, and the ability to store sequences of commands in a way we would later call editing macros.
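As promised above, here is a tiny illustration of the tree idea from the parallelizable-expressions paper (my own sketch, not the 1967 algorithm): a left-to-right sum like ((((a+b)+c)+d)+...) has depth n-1, but regrouping it as a balanced tree gives depth about log2 n, so multiple ALUs can work at the same time.

    def tree_depth(n):
        """Depth of a balanced binary reduction over n operands."""
        depth = 0
        while n > 1:
            n = (n + 1) // 2    # each level halves the number of partial results
            depth += 1
        return depth

    for n in (2, 4, 8, 16, 100):
        print(f"{n:3d} operands: serial depth {n - 1:3d}, tree depth {tree_depth(n)}")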
This completes my spelunking of the first decade of CACM! Names (like those last two) who would go on to lead the 1970s and become legends have begun to appear.
I'm doing CACM since it seems central to the conversation, but there was also a note back in January about scope. CACM, they say, is mostly about systems, with less about numeric computation and theory. More theoretical work appears in ACM's Journal and Reports. I'm not even going to attempt to spelunk those, though; just CACM is enough!

Wednesday, June 29, 2022

Icons for Quantum Network Nodes

The set of icons we created for quantum network nodes, for use in network diagrams, simulators, etc., is available under a Creative Commons license. Use and share! Communication will be simpler if we all use the same icons. It will be easier for others to quickly grasp what we mean when we show them a diagram.

PNGs with either white or transparent background are available at https://github.com/sfc-aqua/quisp/tree/master/Network%20icons.

Tuesday, June 21, 2022

For My Nephew

My nephew just finished his frosh year in college, and wants to study...astrophysics? Not entirely sure yet, but the college where he is offers only a basic physics degree anyway. He seems a little lost on what to study, so this posting is a personal one, for him, but feel free to read on if you're in the same situation.

Okay kiddo,

It sounds like you have done the basics of mechanics and geometric optics, but I'm not actually sure what else in physics. I know you haven't done Maxwell's equations and quantum mechanics, but what about special relativity and introductory thermodynamics (statistical physics)? And in math, I know you've done "calculus", by which I gather you mean functions, limits, and integration and differentiation of a single variable. Good start, but we've got a long way to go.

Despite our conversations, it seems like you don't have a clear picture yet of the curriculum, or even the broad structure of knowledge, in math and physics.  (And since I'm a computer engineer, all of this is from that perspective, of course.) So the first thing to do is to get oriented on what you really need.

Here are a few things to help you understand the structure of the body of knowledge, which should help you figure out what classes to take (or what to study on your own).

Mathematics

For a basic physics degree, you are going to need the following math:
  • Linear Algebra. Multiplying vectors and matrices, solving systems of equations via Gaussian elimination, linear and affine transformations, and eigenvalues and eigenvectors, at least. You'll also be exponentiating matrices ($e^A$, where $A$ is a matrix) in quantum mechanics, and that's easiest if you can diagonalize a matrix (see the short sketch after this list). You can start with my linear algebra videos, but there are lots of resources on the web. I recommend the Georgia Tech Interactive Linear Algebra online, interactive textbook. That's pretty deep; you won't need it all right away, but it's there for a reason. There is also an entire 20-hour lecture course posted as two videos on YouTube, by Prof. Jim Hefferon; I haven't watched it, so I don't know how good it is, but the comments and likes are very positive. Khan Academy has an LA course. 3 Blue 1 Brown is one of the best things on the web, and they have Essence of Linear Algebra available (full course here). In short, there are many excellent, free resources available.
  • Probability, both discrete and continuous. Probability distributions, conditional probability, Bayes' Theorem, moments of distributions. For continuous, you'll need integration, so continuous comes after basic calculus.
  • Statistics. I think most physics majors get away with just what they learn in a probability class unless they specialize in statistical mechanics and thermodynamics.
  • Ordinary differential equations (ODEs). You said you've seen that $f'(x) = f(x)$ is solved only by the function $f(x) = e^x$ (well, times a constant), so you've seen the start of a very deep field.
  • Partial differential equations (PDEs). Next step: derivatives in multiple dimensions. These are equations involving symbols like $\frac{\partial x}{\partial t}$. I first hit this when doing Maxwell's equations; I think it's pretty common for that to happen, but it means you're dealing with both a new math tool and important ideas in physics at the same time, so studying basics of PDEs first is a good idea.
  • Transforms and signal processing are a big deal; you might run into Fourier, Laplace, Z and other transforms. (This is different from the linear and affine transforms above.) Often, these are tools for solving ODEs or PDEs, and might show up in an Applied Math class in your college.
  • Later, you might get into more specialized topics like number theory, group theory, and graph theory. Group theory, for example, shows up in particle physics. The basics of graph theory you can learn very early, actually, and they are critical in computer science but maybe not as much in physics.
I'm sure you can find as much stuff on the later topics as I found for LA.
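Since the matrix exponential came up in the linear algebra item, here is the short sketch promised above (a toy example of my own, using the Pauli X matrix): if $A = VDV^{-1}$ with $D$ diagonal, then $e^A = Ve^DV^{-1}$, and $e^D$ just exponentiates the eigenvalues.

    import numpy as np
    from scipy.linalg import expm      # reference implementation to compare against

    A = np.array([[0.0, 1.0],
                  [1.0, 0.0]])         # the Pauli X matrix: symmetric, easy to diagonalize

    evals, V = np.linalg.eigh(A)       # eigenvalues are -1 and +1
    eA = V @ np.diag(np.exp(evals)) @ V.T

    print(np.allclose(eA, expm(A)))    # True: both give the cosh/sinh matrix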

Physics

Surprisingly, I'm a little less comfortable talking about what basic physics you should study. Let's talk first about optics, since it's one of my favorite parts of physics, demonstrates broadly useful concepts, and happens to be what we are working on for our quantum communications research.

From among the courses we have created already, if you want the basic physics, and to minimize the quantum communications portions, these steps would be a very good follow-on to your work on geometric optics. They are a bit idiosyncratic relative to an ordinary course on optics, naturally focusing on what we need for quantum communications, but they will still carry you a long way.
  • OQC, Lesson 5: Coherent Light and Single Photons, leading up to a quick quantitative intro to lasers.
  • OQC, Lesson 6.1-6.2: Interference, group and phase velocity: a quick introduction to constructive and destructive interference and the notions of group and phase velocity, a distinction that is crucial to understand.
  • OQC, Lesson 7: Waveguides discusses the most important means of guiding light, of which the most famous type is, of course, optical fibers. The total up to here is about two hours' worth of video; you can do this in the time it would take you to watch one soccer match.
  • And then most of our entire module From Classical to Quantum Light, which is about 10 hours of material covering wave equations, Fourier analysis, Maxwell's equations governing how electromagnetic waves work both in a vacuum and in materials, and then into more on single photons and the like, including how detectors work. You might want to taper off when you get to the single photon stuff and defer that for after you have had basic quantum mechanics, so let's say you should do the first ten lessons of that, 45 minutes each, so about 7.5 hours.
  • Among the remaining important topics to learn about are how holograms work, more on polarization, and a lot about antenna design. A good optics course will cover these things.
From the above video on the map of physics, you should have some idea of the basic list of topics:
  • mechanics
  • waves
  • optics
  • special relativity
  • electricity & magnetism
  • introductory quantum mechanics
  • thermodynamics
That much you should take as an engineering or physics student of any sort. If you major in physics, you'll add advanced QM, particle physics, general relativity and some other topics to that list.

Given your interest in soccer, you might consider studying biomechanics, which is a bit of a specialized area and interdisciplinary, so a little biology will help, too. You should check out Professor Ohgi's Sports Dynamics and Informatics Lab as a possible destination for grad school. He works with people up to the level of Olympic athletes to understand the forces and dynamics of athletic performance. Studying some computer science before joining his lab would also be helpful.

Enough for now. Go watch those introductory things to get oriented, then study some linear algebra and probability while you watch our online courses on optics and quantum computing and communications.

Sunday, June 12, 2022

Spelunking CACM, vol. 9 (1966): Eliza (and now 2022's LaMDA)

 Eliza has arrived. Eliza, who is a few months older than I am, is inarguably one of the seminal events in computing history. It's just about the only program created in the 1960s that most of us can still spontaneously name. It helped to spur my own interest in computing.

My first computer was a Radio Shack TRS-80 Model I. I also got a book with programs in it, which meant copying the programs in by hand. (Good training for catching syntax errors!)  One of those programs was called Eliza, though I suspect the BASIC version in my book was pretty different, and probably much simpler, than the 1966 version created by Joseph Weizenbaum in SLIP, a language adding list processing features to FORTRAN.

Eliza works by parsing input text into tokens and finding an important phrase (where phrases are delimited by commas or periods) based on the key word in the phrase, using a list of important words that was created by hand. Then it replaces the primary verb or subject in the phrase with one of a set of stock phrases. If you type in "I love my dog," Eliza might respond, "Tell me why you love your dog." That doesn't mean it has any idea at all what a dog is; it just found a simple pattern and followed it.
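To make that concrete, here is a toy Eliza-style rule in a few lines of Python (my own minimal sketch, nothing like Weizenbaum's SLIP program): find a keyword, keep the tail of the phrase, swap the pronouns, and wrap the result in a stock response.

    import re

    PRONOUN_SWAP = {"my": "your", "i": "you", "am": "are", "your": "my"}

    def eliza_reply(text):
        match = re.search(r"\bI (love|hate|want|need) (.+)", text, re.IGNORECASE)
        if not match:
            return "Please go on."                    # default stock phrase
        verb, rest = match.group(1).lower(), match.group(2).rstrip(".!?")
        swapped = " ".join(PRONOUN_SWAP.get(w.lower(), w) for w in rest.split())
        return f"Tell me why you {verb} {swapped}."

    print(eliza_reply("I love my dog."))    # -> Tell me why you love your dog.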

My TRS-80 was at home, but the high school also had one, and friends of mine also enjoyed playing with Eliza. And what do high school boys want to talk about? Sex and scatological things and bad language, of course. You could make Eliza say some pretty hilarious things that way, by the standards of high school boys.

And now, 56 years later, we have a Google engineer and AI ethicist arguing that an intellectual descendant of Eliza has become sentient.

Chat bots have evolved tremendously since then, and are generally connected to a backend system of some sort, so that they can provide airline assistance or what have you. Often, they are connected to a neural net-based AI both for parsing the language and creating the responses. Generally speaking, no one believes they are actually sentient, but the responses will sometimes appear so insightful that the hair stands up on the back of your neck.

My skeptic's nature and my own very limited experience with chat bots cause me to lean toward siding with the Googlers who said there is no evidence that it's sentient, and plenty of evidence against it. But we are now clearly entering the realm where it will get harder and harder to tell, and the stakes of the arguments will continue to grow. Research into AI ethics grows more crucial every day.

Monday, June 06, 2022

Spelunking CACM, vol. 8 (1965): debugging, peephole optimization, and an undergraduate curriculum

(Click on the image if it isn't large enough to read.)

Overall, 1965 seemed like an "ordinary" year at CACM. I didn't see any papers that really caught my eye as some sort of extraordinary event. (Stay tuned for 1966 for that.) But there are a few things of interest, of course.

By 1965, the idea of wide-area networks was definitely in the air. Paul Baran was publishing RAND reports on the key ideas of packet switching (although I'm not sure when those became publicly available), and the ARPANET project would kick off in 1966. But connections between computers were still much more an idea than a reality, and so I.R. Neilsen had a CACM paper on a 5kHz modem for "callup" connections for exchanging medical data.

Evans and Darley presented DEBUG, which worked with other tools to allow smooth insertion of new lines of assembly while debugging a program, preventing what we now call "spaghetti code". DDT already existed for the PDP-1 at this point. I find this early history of debugging intriguing, since I have a Ph.D. student working on debugging for quantum programs now.

Maybe the most interesting trend is a pair of papers (is two enough for a trend?) on optimizing compilers. In June, Nievergelt presented "On the Automatic Simplification of Computer Programs", which proposed a set of backend techniques for individual optimizations to a stream of instructions. The paper enumerates limitations on the use of the proposed techniques, which interestingly included the constraint that programs not be self-modifying, evidence that that idea was already in use.

Then in July, McKeeman gave us "Peephole Optimization". He implies that the technique is known and "simple" but "often neglected", and also that he is coining the term "peephole optimization" in this paper. Again implemented for the PDP-1, the explanation is clear but limited, being just over a page.
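Just to make the idea concrete, here is a toy peephole pass in the spirit of the paper (my own sketch in Python, not McKeeman's PDP-1 code): slide a small window over the instruction stream and drop obviously redundant pairs, such as a store immediately followed by a load of the same location.

    def peephole(instructions):
        """Drop a LOAD that immediately follows a STORE of the same location."""
        out = []
        for ins in instructions:
            if out and out[-1] == ("STORE", ins[1]) and ins[0] == "LOAD":
                continue                # the value is already in the accumulator
            out.append(ins)
        return out

    program = [("LOAD", "A"), ("ADD", "B"), ("STORE", "C"), ("LOAD", "C"), ("ADD", "D")]
    print(peephole(program))
    # [('LOAD', 'A'), ('ADD', 'B'), ('STORE', 'C'), ('ADD', 'D')]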

But maybe the most interesting thing as an artifact is the report from the ACM Curriculum Committee on Computer Science titled "An Undergraduate Program in Computer Science-- Preliminary Recommendations". Some earlier papers on education appeared in prior years, but I think this is the first actual organized recommendation from ACM itself. Here was the status at the time:

At the time of this writing, in the United States, doctorates can be sought at more than 15 universities, master's degrees at more than 30, baccalaureates in at least 17 colleges, and a sizeable number of other colleges and universities are presently planning or considering departments of computer science.

Impressive. I didn't know there were that many before I was born.

At the top of this post is the key table. Each of the sixteen courses there has a several-paragraph description that is well worth reading. I'm particularly intrigued that Graph Theory was already recognized as an important element in the curriculum. I am working to introduce a Graph Theory class for undergrads at our campus, for many purposes including economics, social sciences, etc., not just CS.

Each course in that table is assumed to be a U.S. 3-semester-hour course consisting of 3 hours/week x 15 weeks = 45 hours of in-class time and probably 2x that in lab or homework time for a total of, oh, 130-140 hours of work and learning. If you take just the nine "Required" plus "Highly Recommended Electives" courses, you're well over a thousand hours of time invested.  Add the seven "Other Electives", and a university that provided them all (noted in the text to be a difficult task), and you have about 2,100 hours of time. And that's not even counting the "Supporting" courses, let alone any arts and social sciences classes required. In short, this is a pretty thorough and ambitious major!

Saturday, June 04, 2022

From Classical to Quantum Optics (古典光学から量子光学へ)



It has been released!

Following on from last year's "Overview of Quantum Communications" (量子通信の基礎), "From Classical to Quantum Light" (古典光学から量子光学へ) is now up on YouTube. If you are interested in quantum communications, the Quantum Internet, and related topics, please take a look. There are 69 videos, about ten hours in all, covering the following topics:


Those are the kinds of topics covered. This academic year (that is, by March of next year) we will release the "Quantum Internet" module, so we hope for your continued interest next year as well.

Tuesday, May 31, 2022

Peak Digital

 Random topic for your end-of-year brain break: I predict that we are at #PeakDigital, within a decade or so. Beyond that, we will have the #SecondQuantumRevolution and the #AnalogRenaissance. The idea of (classical) binary-only data processing will come to seem...quaint.

We will learn to create, process, store, transmit and correct analog data in myriad forms. Same with digital and continuous-variable quantum. All at fidelity and reliability that far exceed vinyl records, film, wax cylinders, differential analyzers and tide calculation machines, allowing essentially infinite rounds of processing and transmission with no detectable degradation.

It will be far faster and more efficient than digital computation, and after some teething pains will come to be seen as flexible and reliable.

People will talk of a quartet of information paradigms -- classical and quantum; digital/discrete and analog/continuous.

Information theory and computational complexity will merge completely.

And the kicker? P ?= NP turns out to be irrelevant, as complexity turns out to be a continuum instead of discrete classes, a topography of hills and valleys, and whether you roll downhill toward polynomial or exponential is a microscopic difference on a ridge.

Now, do you think I actually believe this will work out exactly this way? I'll hedge my bets and keep my cards face down. But this might provoke some conversation. What do *YOU* think information will look like in, say, 2075?

https://twitter.com/rdviii/status/1476071997239336961


Saturday, May 28, 2022

From Classical to Quantum Optics Course on YouTube


The Japanese government is funding a group of us, led by Prof. Kae Nemoto (now at Okinawa Institute of Science and Technology), to create an undergraduate quantum computing (engineering?) curriculum. We are calling it Quantum Academy. As of this writing, getting the full experience requires registering for a class at one of the participating institutions, but we are making our share of the work available as Creative Commons, CC-BY-SA. Each module is intended to be 1 unit that will satisfy MEXT, about ten to twelve hours of face time with faculty, plus reading & homework. (Most Japanese university courses are two units, a total of 40-50 hours of work, about half the size of a U.S. course.)

Our contribution this year is the module From Classical to Quantum Optics. Actually, technically, that link will take you to our YouTube channel's playlists, which include not only that module but also last year's Overview of Quantum Communications. Moreover, there are both English and Japanese versions available! Michal Hajdušek created the materials and the English videos, while I did the Japanese videos.

My apologies, but we are still working on subtitles, especially for Japanese. If anyone would like to do subtitles in another language, please let us know, we are very interested!

Eventually, edited transcripts will be available in book form, as well. Patience, please!

The module begins with the classical wave equation, Fourier analysis, and Maxwell's equations, then gets into single photons. It follows on nicely from Overview of Quantum Communications, where we talked mostly about qubits as abstract things, but also covered the critical technology of waveguides such as optical fibers, without fully justifying why light can be guided that way. Here, Maxwell's equations and the wave equation shore up that foundation. Here's the outline (and a tiny code teaser after it):

From Classical to Quantum Light

Prerequisites: Overview of Quantum Communications, Linear Algebra, Probability

Prerequisites/co-requisites: Differential Equations, Introductory Partial Differential Equations, Introductory Quantum Mechanics, Classical Optics

Recommended next courses: Quantum Internet (coming in 2023)
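
And as a tiny teaser of where the module starts, here is a quick symbolic check (my own sketch using sympy, not part of the course materials) that a travelling wave satisfies the one-dimensional wave equation $\partial^2 u/\partial t^2 = c^2\,\partial^2 u/\partial x^2$:

    import sympy as sp

    x, t, c, k = sp.symbols("x t c k", real=True)
    u = sp.sin(k * (x - c * t))      # a right-moving plane wave

    lhs = sp.diff(u, t, 2)           # second time derivative
    rhs = c**2 * sp.diff(u, x, 2)    # c^2 times second space derivative
    print(sp.simplify(lhs - rhs))    # prints 0: the wave equation holds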