A couple of weeks ago I was on a crowded Tokyo train, and there was an ad at the far end of the car with some interesting data. It showed a family of four travelling by car, and producing 880 liters of CO2 emissions. In contrast, travelling by train produced 92 liters.
I couldn't get close enough to the ad to read the details, and haven't seen it again. The first question is what kind of assumptions they are making -- is this a packed-to-the-gills subway versus a stuck-in-traffic Hummer, or an uncrowded shinkansen green car (first class) versus a K car (sub-sub-compact)? My guess would be that the comparison is intended to be favorable to the train.
I'd also like to know how many kilometers, whether luggage is involved, etc. And how is the electricity to drive the train assumed to be produced? What losses are included? Does the gasoline figure include exploring, pumping, refining, transporting, and delivering the petroleum?
This comparison is worth what you're paying for it, but it's a start... anybody got pointers to a detailed analysis, ideally including shinkansen v. airplane?
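A quick sanity check on the units, with my own round numbers -- the chemistry constants are standard, but the conclusion is only as good as my guess about what the ad actually measured:

```python
# Back-of-envelope unit check (my guesses, not the ad's fine print):
# 880 liters of CO2 *gas* is surprisingly little mass.
MOLAR_VOLUME_L = 24.5        # liters per mole of ideal gas at ~25 C
CO2_MOLAR_MASS_G = 44.0      # grams per mole of CO2
KG_CO2_PER_L_GASOLINE = 2.3  # rough combustion figure

mass_kg = 880 / MOLAR_VOLUME_L * CO2_MOLAR_MASS_G / 1000
print(f"880 L of CO2 is about {mass_kg:.1f} kg,")                        # ~1.6 kg
print(f"what burning ~{mass_kg / KG_CO2_PER_L_GASOLINE:.1f} L of gasoline emits")
```

So 880 liters of CO2 gas is only about 1.6 kg -- the output of burning well under a liter of gasoline -- which suggests the ad's figures are per kilometer, or measured on some basis I couldn't read from across the car.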
Friday, June 30, 2006
Thursday, June 22, 2006
Now Available: Distributed Arithmetic on a Quantum Multicomputer
Our ISCA paper, "Distributed Arithmetic on a Quantum Multicomputer", and our JETC paper, "Architectural Implications of Quantum Computing Technologies", are now both available on the Aqua Project publications page.
500GHz Transistor
John Cressler's research group at Georgia Tech and IBM have created an SiGe chip that runs at 500 GHz. They achieved this by cooling the device to 4.5 K. But it wasn't a slow technology to start with, having a room-temperature switching speed of 350 GHz.
If I did this right, Landauer's Limit says that at 350 GHz of bit destructions at 300 K, each transistor would generate 1.5 nanowatts. So, a billion-transistor chip would have a physical, fundamental limit of at least 1.5 watts, unless reversible computing is used. Still, we're clearly orders of magnitude from that limit... (I may have dropped a small constant there, but I think I'm within a factor of two.)
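Here's the arithmetic spelled out, with the ln 2 factor included this time -- so this is my corrected figure of about 1.0 nW, consistent with the factor-of-two caveat above:

```python
# Landauer limit check: kT ln 2 joules dissipated per erased bit.
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # temperature, K
f = 350e9            # bit destructions per second per transistor

E_bit = k_B * T * math.log(2)   # energy per erased bit
P_transistor = E_bit * f        # watts per transistor
print(f"{P_transistor * 1e9:.2f} nW per transistor")       # ~1.00 nW
print(f"{P_transistor * 1e9:.2f} W for 10^9 transistors")  # ~1.00 W
```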
Computer Architecture Letters & TC
In my list of fast turnaround architecture-related journals, I should have mentioned Computer Architecture Letters. Four-page letters, acceptance rate currently 22%, an official IEEE Computer Society journal, a breathtakingly strong editorial board, and good papers.
It's also worth mentioning, for those of you in quantum computing, that IEEE Transactions on Computers now has Todd Brun on its editorial board; I think this makes it the first journal with a significant architecture component (though not focus) to have someone strong in quantum computing on board. Todd also seems to be very conscientious about getting papers reviewed in a timely fashion.
Friday, June 16, 2006
ISCA and Importance of Conferences
I'm leaving momentarily for the International Symposium on Computer Architecture, in Boston this year. Although Mark Oskin holds the honor of authoring the first ISCA quantum computing paper (in 2003, I believe), this will be the first time there is a full session on the topic. Three papers -- one from Berkeley, one from a mixed group including Davis, Santa Barbara, and MIT, and ours (Keio, HP Labs, and NII) -- will be presented on the last day of the conference (Wednesday). A good performance by the presenters will go a long way toward convincing the architecture community that there is important and interesting work to be done, and that direct involvement of architecture folks will accelerate the arrival of viable quantum computers.
For you physicists who are still learning about the CS conference circuit: the top CS conferences review and publish full papers only, and are very competitive (I think ISCA's acceptance rate this year was 14%). Journals are important, too, but in many ways less so. Of the top ten most-cited venues in 2003, eight are conferences and two are journals, according to CiteSeer. I think CiteSeer's publication delay list is very suspect, but it gives you an idea -- conference publication delays are much shorter. Many ACM and IEEE journals and transactions average more than a year to the return of first reviews, let alone final publication. Recognizing that this is a problem, many of the newer journals, such as JETC, JILP (well, at seven years, JILP might not count as "new"), and TACO are working hard to keep review turnaround time down. But the dialog on open access is, I think, further advanced in physics than in CS, which is not what should have happened -- IMHO, CS should have been the leader on this topic.
Anyway, I'll try to blog from ISCA. The program this year looks exciting, though interestingly, there are no storage papers this year. Perhaps FAST and the new Transactions on Storage are getting the best storage papers these days?
Gedo no Senki
If you're a fan of Ursula K. Le Guin's Earthsea (I am) and the Studio Ghibli anime films (I am), you're probably excited by the upcoming "Gedo no Senki". It's due out next month, directed by Miyazaki Goro, son of the great Miyazaki Hayao.
The trailer is out on Google Video, and Le Guin's website has a synopsis. Le Guin says she hasn't seen the film and won't comment until she does, but that's such an exceedingly cautious and neutral comment that it makes me wonder if she has doubts.
The official web page requires Flash, and for some reason fails to detect that I have it installed.
Pharmaceuticals in Japan
A blurb in the Daily Yomiuri this morning led me to a report by the Office of Pharmaceutical Industry Research that says that 28 of the top 88 best-selling drugs in the world are not currently available in Japan.
The report is in Japanese, and there's a lot of specialized vocabulary I don't grok at a glance, but if I'm reading it right, Japan lags an average of 2.5 years behind the U.S. in approving drugs, with an average of 1,400 days to approval, compared to 500 for the U.S. (I have absolutely no idea what starts the clock on that approval, and I consider it quite possible that a difference in bureaucratic procedures means the reality is either better or worse.) Okay, wait, if I read this right, that's the time from approval in the drug's home country to approval in the local country. That is, a European drug would appear on the U.S. market 500 days after appearing in Europe, but not in Japan until 1,400 days after appearing in Europe -- a gap of roughly 900 days, which is where the 2.5 years comes from.
Of the top 95 drugs worldwide, 38 originated in the U.S., 14 in the U.K., 13 in Japan, 12 in Switzerland, 5 in France, 3 each in Germany, Sweden, and Denmark, and 1 each in Belgium, Israel, and Croatia.
In an unrelated article in the paper, a survey found that 35% of obstetrics departments in Japan have actually stopped delivering babies. More than a third of obstetricians are aged 60 or older, which is nominally retirement age here.
Thursday, June 15, 2006
Where Will You Be When the Big One Hits?
Last Sunday, the Daily Yomiuri had an article based on a report developed by the Tokyo metropolitan government. They estimate that, in the event of a large earthquake during the day, 3.9 million people will be stranded in central Tokyo, too far from home to walk and with no trains or other transport running. That's out of 11.44 million people away from home during the average day in Tokyo. If you include three neighboring prefectures, the number stranded could rise to 6.5 million people. That's a lot of people looking for dinner, some water, a place to lay their head, a toilet, a phone to borrow, and maybe some warm, dry clothes -- not to mention medical assistance.
Among statistical tidbits, it mentions that Shinjuku Station handles four million passengers a day. I've heard various figures for different stations, including (if I recall) 900,000 for Tokyo Station, which sounds low. I think the numbers vary depending on whether you're talking about people starting or terminating journeys, changing trains, or just sitting on a train passing through. With fourteen main platforms in the JR part of the station alone, and several connecting private lines (Odakyu, Keio (no relation to my university), and three subway lines), each with an average of several hundred trains a day and as many as fifteen cars with several hundred people each -- well, you do the math. Wikipedia's Shinjuku Station article (in English) says the number is 3.22 million.
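Okay, let's actually do that math, with one plausible set of round numbers -- these are my guesses, not the article's:

```python
# "You do the math": rough daily passenger count for Shinjuku Station.
operators = 5             # JR plus Odakyu, Keio, and the subway lines, roughly
trains_per_operator = 400 # several hundred trains a day each
cars_per_train = 10
people_per_car = 200      # averaged over the day; rush hour is far worse
print(operators * trains_per_operator * cars_per_train * people_per_car)  # 4,000,000
```

Which lands right at the four-million figure, for whatever a back-of-the-envelope guess is worth.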
The article says that 167,197 people will initially be stranded in Shinjuku (someone needs to teach them the difference between accuracy and precision -- or do they have a list of names already?), of whom more than 90,000 will be stranded until trains resume, which could be days. Designated emergency gathering points include major parks, but if it's raining and cold, you're not going to convince people to stay there for long.
Number of people initially expected to be stranded:
Station | People
---|---
Tokyo | 198,309
Shibuya | 182,858
Shinjuku | 167,197
Ikebukuro | 165,733
Shinagawa | 127,864
Machida | 125,512
Ueno | 89,894
Hachioji | 84,528
The number of people expected to be stranded until train service resumes, broken down by distance from home:
Station | 10-20 km from home | 20+ km from home | Total
---|---|---|---
Tokyo | 31,282 | 111,146 | 142,428
Shibuya | 28,132 | 75,463 | 103,595
Shinjuku | 21,493 | 69,101 | 90,594
Shinagawa | 17,602 | 71,528 | 89,130
Ikebukuro | 22,101 | 62,663 | 84,764
Ueno | 11,522 | 32,712 | 44,234
Machida | 10,376 | 17,920 | 28,296
Hachioji | 6,590 | 10,768 | 17,358
The article also says that on the busiest streets, the density of people may peak at 11 people per square meter 3-4 hours after the quake. That compares to 10 people per square meter on a very crowded train. It seems unlikely to me, but the point that it's going to be crowded when everyone comes out of their offices and tries to walk home at the same time is quite valid.
I have some doubts about the numbers, but bottom line: when I get back from ISCA, an earthquake kit for the office is high on my priority list. It's way overdue already.
Random Numbers
The NIST Computer Security Resource Center has just published a report on how to generate good random numbers. This is critical for computer security, but if you're in any sort of physical simulation or use Monte Carlo methods, and you don't know how good your pseudo-random number generator (PRNG) is (or isn't), you should check this out. Bad random numbers will do weird things to you.
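As a toy illustration of how badly a plausible-looking generator can fail -- this is the classic RANDU example, not something from the NIST report:

```python
# RANDU: x[k+1] = 65539 * x[k] mod 2^31. Every consecutive triple satisfies
# x[k+2] - 6*x[k+1] + 9*x[k] == 0 (mod 2^31), so in 3D the "random" points
# all fall on just 15 planes -- disastrous for a Monte Carlo simulation.
def randu(seed, n):
    x, out = seed, []
    for _ in range(n):
        x = (65539 * x) % 2**31
        out.append(x)
    return out

xs = randu(1, 10_000)
assert all((xs[i + 2] - 6 * xs[i + 1] + 9 * xs[i]) % 2**31 == 0
           for i in range(len(xs) - 2))
print("every RANDU triple obeys a fixed linear relation -- badly non-random")
```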
Cribbed from Bruce Schneier's blog.
Tuesday, June 13, 2006
Happy 20th Anniversary
Today, June 13, 2006, is the twentieth anniversary of our graduation from a small technical school in Pasadena.
Tiger, Yosufi, Roseytoes, Min, Bobo, Harold, Myles, Ross, Michelle, Ken, Takako, Hod, and the rest, What a Long Strange Trip It's Been.
I'm sorry I missed our reunion a few weeks ago, but as you can guess from yesterday's post, I've been busy. Love to you all, and hope to see you all soon...
Just Call Me "Doc Shorts"
I passed my thesis defense today, so I'm now Doctor Van Meter. Or, at least, will be once the final paperwork is done; it has to be printed, bound, and submitted to an all-faculty meeting next month. My committee has some suggested changes, but they left them as recommendations, so the final tweaks are up to my adviser (Prof. Teraoka) and me, as I understand it.
My thesis is, as I've mentioned, "Architecture of a Quantum Multicomputer Optimized for Shor's Factoring Algorithm". After I make the final revisions, I should be posting it on the arxiv, so stay tuned. (If you're dying for a preview, or want to put in your two cents before it's final, email me and I'll drop you a copy.)
After a lot of work and worry, the defense itself was actually remarkably uneventful. I didn't count, but there were about twenty people there, counting the professors, which is a pretty good crowd. My committee asked a number of intelligent questions, but the grilling was moderate, compared to the first presentation a month ago, when I got shelled pretty good. Thanks to the help of my advisers, the presentation got a lot better in the interim.
My committee was five professors, plus one unofficial member. That's a crowd, but for such an interdisciplinary topic it seemed kind of necessary; I had a theoretical physicist (Prof. Kae Nemoto, who will be my boss in my postdoc), an experimentalist (Prof. Kohei Itoh), a computer networking guy (Prof. Teraoka, my official adviser), one parallel systems expert (Prof. Amano), and another architecture guy (Prof. Yamasaki). The unofficial member is Prof. Jun Murai, who is now Keio's vice president in charge of research -- provost, more or less. I'm very flattered that he took the time to attend. Thanks to all of you for the hard work in mentoring me and evaluating my thesis.
Ah, the one real event of the day, which will no doubt go down in history as part of the Rod Legend. Defenses, in this country, are always given wearing a suit. But I hate suits, and the weather is now hot, sticky, and rainy, so I took my suit, still in the dry-cleaning plastic, and shoved it in a bag, and commuted the two hours to campus in shorts and a t-shirt.
Last Friday, Kae said, "On Monday, remind me to..." I interrupted her and said, "Huh-huh, I'm not taking responsibility for that. On Monday, if I remember to show up carrying my laptop and wearing pants, I'll be happy."
Do you see the punchline yet?
I got to campus, opened my bag, and... no pants. I had grabbed the suit coat, but the pants came from the dry cleaners packaged separately. Oops.
Fortunately, that was still early in the morning, Mayumi was coming to hear my defense, and she hadn't left the house yet. I called her, and she brought them with her, and everything worked out.
So, you can call me "Doc Shorts" or "Doctor No Pants" (or just "Doctor No"?). (Beats some of the nicknames Wook has had for me over the years, but I suppose I've reinforced rather than outgrown "Bean Dip".)
Thanks to all of you who have given me so much support over the years. I couldn't have done it without you.
Friday, June 02, 2006
Dang, I Forgot to Set the VCR!
So, you meant to record that World Cup match taking place while you're at work, but you forgot to set up the recorder. No problem, if you have a Sony Ericsson SO902iS keitai (cell phone) and one of several Sony models of high-definition video recorders (either hard disk or DVD, I think). Just call up the G-guide website on your cell phone, find the right program, and click a couple of buttons; your phone contacts your video recorder via the Internet and sets it to record for you.
You have to set up the service beforehand, but it's free. Connect to the program guide website, do some setup there, set a password on your DVR (digital video recorder), then have your phone connect to the DVR via infrared the first time. Presumably they exchange enough information then to find and identify each other via the Internet when you're away from the house.
Obviously, this means your DVR has to be connected to the Internet. Details of that are vague, but my first question is how NAT is dealt with. Does your DVR have to have a permanent, non-NATed, global address? Or does it get around this by polling some shared server frequently? Or, do you have to convince your ISP to set up a passthrough in the NAT so that you can connect to your DVR? Ugh. Just switch to IPv6, man (of course, your home or ISP firewall would still have to be smart enough to let the right traffic through...).
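For what it's worth, here's a sketch of how the polling approach might look -- pure speculation on my part, with an invented server URL and message format, definitely not Sony's actual protocol:

```python
# Hypothetical "poll a shared server" NAT workaround: the DVR makes only
# *outbound* HTTP requests, so the home NAT needs no special configuration.
import json, time, urllib.request

POLL_URL = "https://example.invalid/dvr/pending"  # invented rendezvous server

def fetch_pending(dvr_id: str, password: str):
    req = urllib.request.Request(f"{POLL_URL}?dvr={dvr_id}",
                                 headers={"X-DVR-Auth": password})
    with urllib.request.urlopen(req) as resp:
        # invented message shape, e.g. [{"channel": 4, "start": ..., "end": ...}]
        return json.loads(resp.read())

while True:
    for job in fetch_pending("dvr-1234", "secret"):
        print("schedule recording:", job)  # hand off to the recorder hardware
    time.sleep(60)
```

The cost of this design is latency (up to one polling interval) and a steady trickle of traffic to the shared server, but it works through any NAT unmodified.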
HSDPA Handsets
While digging for something else I'll blog about shortly, I ran across an announcement of the N902iX, an NTT DoCoMo FOMA 3G/HSDPA handset. What does that mean to you? If your 3G cell phone network is built out for it, and increasing numbers of them are, you can, in theory, get up to 3.6 Mbps to your cell phone -- nearly ten times the nominal 384 kbps of straight WCDMA. Mostly HSDPA has been available as PCMCIA cards for laptops; the claim is that this is the first true HSDPA handset.
Speaking of which, apparently a good fraction of the FOMA handsets now available include triband GSM. My N900iG, bought in January 2005, was the first such handset (and suitably buggy and power-hungry), but now DoCoMo is saying that within two years they will require all of the FOMA handsets to support GSM. The announcement I saw didn't say which frequencies will be mandated, but I think the ones in use are still proliferating worldwide...
Computing Frontiers: Why Study Quantum?
I'm just about ten days from my final defense, and while I had considered tapping the Net for people interested in commenting on my thesis, I admit I never got around to it. Here's a preview, though: the prolegomenon justifying why I think it's worthwhile for computer systems folks to be involved in quantum computing now. Comments welcome!
We are just started on a great venture. -- Dwight Eisenhower
The designer usually finds himself floundering in a sea of possibilities, unclear about how one choice will limit his freedom to make other choices, or affect the size and performance of the entire system. There probably isn't a best way to build the system, or even any major part of it; much more important is to avoid choosing a terrible way, and to have clear division of responsibilities among the parts. I have designed and built a number of computer systems, some that worked and some that didn't. -- Butler Lampson, "Hints for Computer System Design" [187]
As VLSI features continue to shrink, computers that depend on quantum mechanical effects to operate are inevitable [221, 211, 47, 143, 104]. The fundamental architectural issue in these future systems is whether they will attempt to hide this quantum substrate beneath a veneer of classical digital logic, or will expose quantum effects to the programmer, opening up the possibilities of dramatic increases in computational power [108, 89, 88, 35, 38, 279, 127, 3, 196, 233].
Small and unreliable though they are, quantum computers of up to a dozen nuclear spins [79] and eight ions [131] already exist. The three most famous quantum algorithms are Deutsch-Jozsa [89], Grover's search [127], and Shor's factoring [279]. All three of these algorithms have been experimentally implemented for small-scale problems [151, 70, 68, 163, 310, 319, 320, 130]. A further extremely broad range of experiments has demonstrated numerous building blocks [326, 29, 296, 169, 224, 64, 251] based on the one- and two-qubit technology demonstrations we will see in Chapter 7. Although many theoretical and practical questions remain open, it seems reasonable to assert that implementation of quantum computation is on the verge of moving from a scientific problem to an engineering one. It is now time to ask what we can build, and what we should build. Various computer architecture researchers have begun investigating the former question, working from the bottom up [78, 146, 241, 240, 305, 145]; this dissertation and the related papers address the latter question, working from the top down [314, 317, 316, 312, 313, 315].
Why should computer engineers study quantum computation, and why now? Certainly the field of classical computer architecture is not moribund, and offers far more immediate impact for much less intellectual risk. Work that increases parallelism, reduces power consumption, improves I/O performance, increases gate speed or reduces data propagation delays is much more likely to be used in the real world, and far sooner than quantum technologies. Intel began sampling a billion-transistor microprocessor chip in October 2005, a 580 square-millimeter chip built in a 90 nanometer process. Some researchers consider integration levels of a trillion transistors per silicon chip possible [213], though we are hardly done digesting the implications of a billion transistors on a chip [246, 178, 56]. Clearly there is room on-chip for many architectural advances. Ubiquitous computing, sensor networks, augmented reality, and mobile systems will no doubt be among the most transformative technologies of the coming decades, relegating today's 3G Internet-connected mobile phones to the status of Neolithic stone adzes [261]. In "back end" systems, continued research on computational grids and storage is critical. Among computing exotica, electrical circuits fabricated with nanotechnology [344, 32, 205, 304, 267], DNA computing [8], and amorphous computing are all other possible fields of pursuit [5]. So, why quantum?
Different researchers have different reasons for studying quantum computing. Physicists are learning fundamental facts about the quantum behavior of both individual particles and mesoscopic systems. Theoretical computer scientists are finding many fascinating new questions (and answering some of them). But to a computer systems person, quantum computation is about one thing: the pursuit of performance. If practical large-scale quantum computers can be built, we may be able to solve important problems that are classically intractable. Potential applications include cryptographically important functions such as factoring, which appears to offer a superpolynomial speedup, and scientifically important problems such as simulations of many-body quantum systems, which may offer exponential speedup. Quantum computers therefore hold out the possibility of not just Moore's Law increases in speed, but a change in computational complexity class and consequent acceleration on these, and possibly other, problems.
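For concreteness, the standard asymptotics behind the factoring claim -- textbook results, not anything new in this dissertation -- for factoring an n-bit number:

```latex
% Best known classical algorithm (general number field sieve) vs. Shor:
T_{\mathrm{GNFS}}(n) = \exp\!\left( O\!\left( n^{1/3} (\log n)^{2/3} \right) \right)
  \quad \text{(superpolynomial)}
\qquad
T_{\mathrm{Shor}}(n) = O(n^3)
  \quad \text{(polynomial gate count, roughly)}
```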
I will not directly address criticisms of the possibility of quantum computation [98, 158], except to note that my response is different from that of Aaronson, who is excited by the inherent beauty and theoretical importance of quantum mechanics while searching for the ultimate limits to computation [3]. I, too, admire these factors, but more importantly I believe it is inevitable, as silicon devices continue to scale down in size, that we will have to deal with quantum effects. Many researchers are directing their efforts at mitigating these effects; in my opinion, we will do better by embracing them, even if "quantum computing" ultimately proves to have no computational advantage over classical.
Quantum effects are also being explored for direct exploitation as classical logic, such as recent work on magnetic quantum dot cellular automata [144]. Plasmonics, the study of electromagnetic waves propagating in the surface of a material, is developing rapidly, and might offer improvements in how we move data within classical chips [243]. More broadly, the whole area called spintronics, directly or indirectly manipulating the spin of small numbers of electrons, is already having an impact through the creation of technologies such as magnetic RAM (MRAM) [309, 330]. It has been suggested that classical computers must employ reversible logic to exceed 10^22 floating point operations per second (10 zettaFLOPS) [86]. Quantum computation serves as an excellent training ground for engineers destined to work in these areas, as well as providing both fundamental and practical results that influence the technological development of these areas.
My analogy is to the field of robotics. It has been more than eighty years since the original use of the term robot to mean an autonomous, mechanical humanoid (though the idea goes back to antiquity) [322], and several decades since the debut of robotics as a respectable field of inquiry. Yet the humanoid robots of science fiction do not roam the streets of Tokyo in the first decade of the twenty-first century. This does not mean that robotics as a field has been barren; indeed, robots dominate many forms of manufacturing, and the related technologies spun off from robotics research are nearly ubiquitous. Robotics depends on, and serves as an impetus for, research as diverse as computer vision, speech recognition, fuzzy logic, virtual reality, and many mechanical advances. The road to development has been long, and the results to date look nothing like what mid-twentieth century science fiction writers such as Isaac Asimov anticipated, but the results have been extremely valuable nonetheless. So I expect it to be with quantum computing.