Friday, June 02, 2006

Computing Frontiers: Why Study Quantum?

I'm just about ten days from my final defense, and while I had considered tapping the Net for people interested in commenting on my thesis, I admit I never got around to it. Here's a preview, though: the prolegomenon justifying why I think it's worthwhile for computer systems folks to be involved in quantum computing now. Comments welcome!

Chapter 1 Introduction



We are just started on a great venture.
Dwight Eisenhower

The designer usually finds himself floundering in a sea of possibilities, unclear about how one choice will limit his freedom to make other choices, or affect the size and performance of the entire system. There probably isn't a best way to build the system, or even any major part of it; much more important is to avoid choosing a terrible way, and to have clear division of responsibilities among the parts. I have designed and built a number of computer systems, some that worked and some that didn't.
Butler Lampson, "Hints for Computer System Design" [187]

As VLSI features continue to shrink, computers that depend on quantum mechanical effects to operate are inevitable [221, 211, 47, 143, 104]. The fundamental architectural issue in these future systems is whether they will attempt to hide this quantum substrate beneath a veneer of classical digital logic, or will expose quantum effects to the programmer, opening up the possibilities of dramatic increases in computational power [108, 89, 88, 35, 38, 279, 127, 3, 196, 233].

Small and unreliable though they are, quantum computers of up to a dozen nuclear spins [79] and eight ions [131] already exist. The three most famous quantum algorithms are Deutsch-Jozsa [89], Grover's search [127], and Shor's factoring [279]. All three have been experimentally implemented for small-scale problems [151, 70, 68, 163, 310, 319, 320, 130]. A much broader range of experiments has demonstrated numerous building blocks [326, 29, 296, 169, 224, 64, 251] based on the one- and two-qubit technology demonstrations we will see in Chapter 7. Although many theoretical and practical questions remain open, it seems reasonable to assert that the implementation of quantum computation is on the verge of moving from a scientific problem to an engineering one. It is now time to ask what we can build, and what we should build. Various computer architecture researchers have begun investigating the former question, working from the bottom up [78, 146, 241, 240, 305, 145]; this dissertation and the related papers address the latter question, working from the top down [314, 317, 316, 312, 313, 315].
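As a rough guide to what these three algorithms buy us (eliding constant factors and the details of the query models), the separations are:

\begin{align*}
\text{Deutsch-Jozsa:}\quad & 1 \text{ quantum query vs. } 2^{n-1}+1 \text{ deterministic classical queries}\\
\text{Grover:}\quad & O(\sqrt{N}) \text{ quantum queries vs. } \Theta(N) \text{ classical queries}\\
\text{Shor:}\quad & \mathrm{poly}(n) \text{ quantum gates vs. best-known sub-exponential classical time}
\end{align*}

Here n is the number of input bits (or the length of the number being factored) and N is the size of the search space.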

1.1 Computing Frontiers: Why Study Quantum?



Why should computer engineers study quantum computation, and why now? Certainly the field of classical computer architecture is not moribund, and offers far more immediate impact for much less intellectual risk. Work that increases parallelism, reduces power consumption, improves I/O performance, increases gate speed, or reduces data propagation delays is much more likely to be used in the real world, and far sooner than quantum technologies. Intel began sampling a billion-transistor microprocessor chip in October 2005, a 580 square-millimeter chip built in a 90 nanometer process. Some researchers consider integration levels of a trillion transistors per silicon chip possible [213], though we are hardly done digesting the implications of a billion transistors on a chip [246, 178, 56]. Clearly there is room on-chip for many architectural advances. Ubiquitous computing, sensor networks, augmented reality, and mobile systems will no doubt be among the most transformative technologies of the coming decades, relegating today's 3G Internet-connected mobile phones to the status of Neolithic stone adzes [261]. In “back end” systems, continued research on computational grids and storage is critical. Among computing exotica, electrical circuits fabricated with nanotechnology [344, 32, 205, 304, 267], DNA computing [8], and amorphous computing [5] are other possible fields of pursuit. So, why quantum?

Different researchers have different reasons for studying quantum computing. Physicists are learning fundamental facts about the quantum behavior of both individual particles and mesoscopic systems. Theoretical computer scientists are finding many fascinating new questions (and answering some of them). But to a computer systems person, quantum computation is about one thing: the pursuit of performance. If practical large-scale quantum computers can be built, we may be able to solve important problems that are classically intractable. Potential applications include cryptographically important functions such as factoring, which appears to offer a superpolynomial speedup, and scientifically important problems such as simulations of many-body quantum systems, which may offer exponential speedup. Quantum computers therefore hold out the possibility of not just Moore's Law increases in speed, but a change in computational complexity class and consequent acceleration on these, and possibly other, problems.
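To put a number on "superpolynomial": the best known classical factoring algorithm, the general number field sieve, runs in sub-exponential but superpolynomial time in the bit-length n of the number being factored, whereas Shor's algorithm needs only polynomially many quantum gates. Glossing over low-order terms and the choice of arithmetic circuits,

\[
T_{\mathrm{GNFS}}(n) \approx \exp\!\left(\left(\tfrac{64}{9}\right)^{1/3} (n \ln 2)^{1/3} \left(\ln(n \ln 2)\right)^{2/3}\right),
\qquad
T_{\mathrm{Shor}}(n) = O(n^3),
\]

where the O(n^3) gate count assumes straightforward schoolbook modular arithmetic; asymptotically faster multiplication reduces it further.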

I will not directly address criticisms of the possibility of quantum computation [98, 158], except to note that my response is different from that of Aaronson, who is excited by the inherent beauty and theoretical importance of quantum mechanics while searching for the ultimate limits to computation [3]. I, too, admire these qualities, but more importantly I believe it is inevitable, as silicon devices continue to scale down in size, that we will have to deal with quantum effects. Many researchers are directing their efforts at mitigating these effects; in my opinion, we will do better by embracing them, even if "quantum computing" ultimately proves to have no computational advantage over classical computing.

Quantum effects are also being explored for direct exploitation as classical logic, such as recent work on magnetic quantum dot cellular automata [144]. Plasmonics, the study of electromagnetic waves propagating along the surface of a material, is developing rapidly, and might offer improvements in how we move data within classical chips [243]. More broadly, the whole area called spintronics, directly or indirectly manipulating the spin of small numbers of electrons, is already having an impact through the creation of technologies such as magnetic RAM (MRAM) [309, 330]. It has been suggested that classical computers must employ reversible logic to exceed 10^22 floating point operations per second (10 zettaFLOPS) [86]. Quantum computation serves as an excellent training ground for engineers destined to work in these areas, as well as providing both fundamental and practical results that influence their technological development.
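To illustrate why reversibility enters at that scale, here is a back-of-envelope sketch of my own (not a reproduction of the analysis in [86]; the bits-erased-per-FLOP and device-overhead figures are assumed, illustrative numbers): Landauer's principle puts a floor of kT ln 2 on the energy dissipated each time a bit is irreversibly erased, and real logic sits well above that floor.

import math

# Back-of-envelope Landauer arithmetic for an irreversible 10 zettaFLOPS machine.
k_B = 1.380649e-23                         # Boltzmann constant, J/K
T = 300.0                                  # operating temperature, K
landauer_per_bit = k_B * T * math.log(2)   # ~2.9e-21 J per erased bit

flops = 1e22                 # 10 zettaFLOPS
bits_erased_per_flop = 100   # assumption: a 64-bit FLOP discards on the order of 100 bits

floor_watts = flops * bits_erased_per_flop * landauer_per_bit
print(f"Landauer floor: {floor_watts:.0f} W")          # roughly 3 kW

# Real CMOS dissipates orders of magnitude more than the Landauer limit per
# switching event; take a factor of 1e4 as an illustrative (assumed) gap.
device_overhead = 1e4
print(f"With ~{device_overhead:.0e}x overhead: {floor_watts * device_overhead / 1e6:.0f} MW")

The particular numbers matter less than the conclusion: at such rates the thermodynamic cost of discarding information becomes a first-order design constraint, and reversible logic, like quantum logic, avoids those erasures in principle.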

My analogy is to the field of robotics. It has been more than eighty years since the original use of the term robot to mean an autonomous, mechanical humanoid (though the idea goes back to antiquity) [322], and several decades since the debut of robotics as a respectable field of inquiry. Yet the humanoid robots of science fiction do not roam the streets of Tokyo in the first decade of the twenty-first century. This does not mean that robotics as a field has been barren; indeed, robots dominate many forms of manufacturing, and the related technologies spun off from robotics research are nearly ubiquitous. Robotics depends on, and serves as an impetus for, research as diverse as computer vision, speech recognition, fuzzy logic, virtual reality, and many mechanical advances. The road to development has been long, and the results to date look nothing like what mid-twentieth century science fiction writers such as Isaac Asimov anticipated, but the results have been extremely valuable nonetheless. So I expect it to be with quantum computing.
