(This posting is primarily for readers in the Tokyo area, but if you're into jazz, you might want to find the CDs anyway.)
Robbie Belgrade is a pal of mine. We met a couple of years ago on the way to Kodo's Earth Celebration. Robbie's a professional jazz musician, playing percussion and woodwinds. He studied tabla under Zakir Hussain, one of the most influential tabla players in the world.
Robbie's band Tatopani (which means "hot water" in Nepali) will be playing a release party for their new CD "Azure" in Roppongi on 11/24.
I haven't heard this particular band; like most jazz musicians, Robbie plays with many groups, and the times I've heard him play have been quite enjoyable. The instrumentation is often very "world music": in addition to Robbie's tabla, pandeiro, and assorted percussion, the leader of Candela (another group Robbie plays with) plays the shakuhachi (a Japanese bamboo flute), for example. But the tunes are mostly pretty straight-up jazz, which makes for pleasant sounds outside the usual tonal palette. The bands have been pretty tight, and the solos thoughtfully constructed (and sometimes humorous).
Tokyo, in general, is a good place to hear live jazz. Come out and support live music. Hope to see you there...
Friday, October 28, 2005
MS+S2006: quantum computing conference in Japan
I just got a flier in the mail for MS+S2006, the International Symposium on Mesoscopic Superconductivity and Spintronics, sponsored by NTT and JST.
Sounds esoteric, but I went in 2004 (it's biennial), and about a third of the presentations and posters were explicitly quantum information processing, and at least another third were of some interest to QIP folks; last year the subtitle was even "In the light of quantum computation". (My paper was a somewhat abstract one on balancing quantum and classical computation, for example.)
It's a physics conference, so acceptances are based on abstracts submitted shortly before the conference, and many of the participants who aren't selected for the oral program present posters. The proceedings, with full papers, are published long after the conference. This all seems odd to those of us used to ACM and USENIX conferences, where full papers are reviewed, acceptance is extremely competitive for many of the conferences, and full proceedings are handed out at the conference itself.
NTT does an excellent job of hosting the conference, and I met a number of important researchers there (Martinis, Lloyd, Tsai, Nakamura, Mooij, to name a few); they get quite the list of attendees and speakers.
Abstracts due Jan. 23, conference Feb. 27-Mar. 2.
Anyway, consider coming, and I look forward to seeing you there!
Wednesday, October 26, 2005
Bad News and Good News
No, nothing personal or professional - Bad News is the name of Tom Fenton's book. It's a rant about the decline of the media, especially broadcast news. It's a pretty good rant, but as a proposal for fixing the system, it leaves a lot to be desired.
Fenton is a (now retired) foreign correspondent for CBS News. He's from the heyday of foreign correspondents - the 1970s - when money wasn't an object, and foreign correspondents were respected and enabled inside their organizations. He rants about all the usual suspects - the discovery that news can make money, the rise of the bean counters, sycophantic White House reporters, lazy reporters, dumb (or myopic) producers, and a public that would rather be entertained than informed.
Fenton believes, and I agree, that foreign news is incredibly important and that it's drastically underplayed in American media, even including the New York Times, but especially broadcast television news, which is where he spent his career. I also agree that there's no substitute for boots (or birkenstocks or galoshes or whatever foreign correspondents wear) on the ground, and that building an understanding of a culture takes time and effort - years, in fact, maybe decades.
But his proposed solution is backwards-looking. He mentions bloggers several times and talks about how laptops and DVD camcorders make filing reports from the field cheaper. But he demonstrates in his proposals that he Just Doesn't Get It about how completely the world has changed. I'm not talking about U.S. politics, geopolitics, or the news business; I'm talking about the technology.
I'm not totally enamored of all the things the brave new world we're creating brings us (see David Brin's The Transparent Society), but I'm pretty certain that a great many of them are going to happen anyway. And most of the ones affecting news are going to be good (I've got a good rant on how narrowcasting means that the world doesn't appear the same to any two people, but we'll do that one another time). The problem is, Fenton has no particular plans to take advantage of anything but the most basic advances.
Fenton's proposals amount to, roughly, three things:
- Recreate the foreign bureaus of the major news organizations.
- Build a grassroots organization that will monitor and push for quality of news reporting.
- Extend the nightly news to an hour, so there's room for in-depth reporting.
That's about it. None of them are bad things (and all have been tried, to one extent or another), but none of them demonstrate the least grasp of what is about to happen to media over the next decade or two.
For an extremely crude idea of where this is headed, see, say, the New York Times' coverage of Harriet Miers' nomination to the Supreme Court, or CNN's web site on any given day. Text stories, photos, multimedia slide shows with audio, video clips, and links to background data, both inside and outside their own websites. (Don't get me started ranting about proprietary data formats.)
Now project out a decade (don't put too much faith into that timeline). How about this:
It's six ten, and you sit down in your car to drive home (we'll assume one of those old-fashioned drive-it-yourself cars). It's a forty-minute commute, and you always do the evening news. You don't sweat the fact that the broadcast started ten minutes ago; your in-car box time-shifts. You get forty minutes of radio, pausing to take a phone call and replaying part of one story, so you only cover thirty minutes of the broadcast.
You pull into the driveway, go inside the house and start cooking dinner. The box in the kitchen offers to pick up where you left off on the radio broadcast and give you the last half hour, or go back and show you the video clips associated with the stories you heard in the car. You elect the video clips; the box plays them with abbreviated commentary, since it has kept track of what you've already heard. You take a couple of the offered backgrounders, getting several minutes on, say, the political flap over Koizumi's visits to Yasukuni Shrine back when China and Japan were rivals. Partway through, your spouse comes in with the kids, so you switch it off.
After the kids are in bed, as you're getting ready to turn in yourself, you catch the last few minutes. You tell the box to compress down to five minutes; you're tired. It edits down several stories, including a brief update on one of the ones you heard earlier.
Would I commit an hour a day to media like that? You bet. I don't do much TV, but I probably spend better than an hour a day with the Daily Yomiuri and the New York Times, my two top sources.
Fenton needs to realize that the network news broadcast is an anachronism. It'll be gone before he cashes his first Social Security check. Here's what I think a news package delivered to the viewer needs (a rough sketch in code follows the list):
- It must work across all media - text, images, sound, video - and all viewing and delivery platforms - HDTV, cell phone video, cell phone audio, PDA text, car radio. (Call this variable width, if you like.)
- It has to retain context as you move from box to box (car radio to kitchen TV) and session to session (e.g., updates without repeating the whole story the next morning).
- It has to offer variable depth; the same story should be available in one-, three-, and ten-minute formats. Creating and managing this variable depth is critical and difficult.
- It has to offer still more depth, including the provenance of all of the data, access to multiple viewpoints on the same events, multiple translations of foreign-language text and speech, as well as the originals. For edited video of, say, a riot, the fifteen seconds shown in the organized report should link back to the hour of raw footage in a useful fashion.
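To make the variable-depth and cross-box context requirements concrete, here's a toy sketch in Python of what a "story object" and a viewer context might look like. This is my own illustration, not anything Fenton proposes; every class and field name here is hypothetical.

    # A toy sketch: one way a news story could carry variable depth (multiple
    # cuts of the same story) and viewer context (what you've seen, on any box).
    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class Rendition:
        medium: str          # "text", "audio", "video"
        duration_sec: int    # depth: the one-, three-, and ten-minute cuts
        uri: str

    @dataclass
    class Story:
        story_id: str
        headline: str
        renditions: list = field(default_factory=list)
        sources: list = field(default_factory=list)  # provenance: raw footage, originals

        def best_fit(self, medium: str, budget_sec: int) -> Optional[Rendition]:
            """Deepest rendition in the right medium that fits the time budget."""
            fits = [r for r in self.renditions
                    if r.medium == medium and r.duration_sec <= budget_sec]
            return max(fits, key=lambda r: r.duration_sec, default=None)

    @dataclass
    class ViewerContext:
        """Follows you from car radio to kitchen TV to bedroom."""
        seen_sec: dict = field(default_factory=dict)  # story_id -> seconds consumed

        def record(self, story: Story, seconds: int) -> None:
            self.seen_sec[story.story_id] = self.seen_sec.get(story.story_id, 0) + seconds

        def is_update(self, story: Story) -> bool:
            return self.seen_sec.get(story.story_id, 0) > 0  # abbreviate, don't repeat

The kitchen box's trick of picking up where the car radio left off, with abbreviated commentary, is then just best_fit plus a check of what's already been seen.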
There have been lots of other discussions on how searchable and customizable it has to be; I won't go into those. Nick Negroponte has been talking since the founding of the Media Lab about viewing a baseball game from the point of view of the baseball; toss that in, too.
What does a news organization need to provide this kind of content? A few tools, especially for creating the rich, variable depth. Nothing I don't expect to come along - advertisers will enjoy having that at least as much as news producers. Those are technical considerations; what about the people? A foreign correspondent is no longer the source of much original content, even today, let alone a decade from now, when raw footage will be coming from everyone with a camera in their cell phone. What a foreign correspondent is, is a guide to the torrent of available information. He or she organizes it, makes a story out of it, culls the important material, knows who the people are. That's what takes a truly broad and deep understanding of the issue at hand. Much of that work can, in theory, be done in London or New York, but you won't build the personal relationships and gut-level feel for the places, people, and events without being there.
Anyway, just a few thoughts about the future of the news. Fenton has lots of worthwhile anecdotes on how sordid the behavior is on all sides of the Middle East and Central Asia theaters, but I'll leave those for another time, too.
Saturday, October 22, 2005
The Scariest Thing I've Ever Read
Just in time for Halloween...
Stephen King called The Hot Zone "one of the most horrifying things I've ever read." I would hand that title instead to David Goodstein's Out of Gas (he has also lectured on this topic; you can get RealVideo here). Goodstein's concern is not the possible deaths of millions of people due to Ebola and Marburg. It's not the Bush administration paving over Yosemite. It's not even anthropogenic global warming and a piddling few degrees' increase in average temperatures. He's worried about whether or not we will run out of fossil fuels before we figure out how to live sustainably, and consequently whether civilization as we know it will come to an end.
Goodstein is a physics professor at Caltech. In the book, he heads straight for the basics: we're consuming energy that was stored in the ground long ago. We are consuming it far faster than it replenishes itself. It will run out; what are we to do when it does?
Goodstein covers the well-known Hubbert peak, and suggests that we are now at the world-wide peak of petroleum production, give or take a decade or so. When we hit that peak, we are about halfway done with the amount of oil we can successfully extract and use. But that doesn't mean we have another 150 years to go before problems occur; because demand is increasing and production is flat to declining as we pass the peak, serious shortages are likely to begin within a decade or two at most. Such shortages will lead to price instability, and are especially problematic since petroleum is such a flexible, useful material. We use it in plastics, fertilizers, and more, as well as burning the stuff directly, so a shortage has the potential to threaten such fundamental societal requirements as food production.
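Here's a minimal Python sketch of the logistic depletion model behind the Hubbert curve; the constants are my own round numbers for illustration, not Goodstein's figures. The point of the math is that production peaks exactly when cumulative extraction reaches half the recoverable total:

    # Logistic (Hubbert-style) depletion model; all constants assumed.
    import math

    URR = 2000.0    # ultimately recoverable oil, billions of barrels (assumed)
    k = 0.06        # logistic growth/decline rate per year (assumed)
    t_peak = 2005   # peak year: "now, give or take a decade or so"

    def cumulative(t):
        """Total extracted by year t; exactly URR/2 at the peak."""
        return URR / (1.0 + math.exp(-k * (t - t_peak)))

    def production(t):
        """Annual production, the bell-shaped derivative of the curve above."""
        return URR * k / (2.0 * (1.0 + math.cosh(k * (t - t_peak))))

    for year in (1970, 2005, 2030):
        print(year, round(production(year), 1), "Gb/yr,", round(cumulative(year)), "Gb total")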
He discusses where oil comes from, how inefficient it is to use oil shale, how natural gas, coal, and even accessible supplies of fissionable uranium will run out, and relatively soon. Each of those supplies is worth a few decades, given reasonable projections about population growth, energy consumption, and the fraction of the resource that is recoverable at reasonable economic and thermodynamic returns. All told, if we are flexible enough about our fuels, we have maybe until the end of the century. Then what?
Our only choice is fusion. We can either get our energy indirectly, by creating solar cells that ultimately derive the power they deliver from the sun's fusion, or we can figure out how to run fusion here on earth - but we haven't been very successful at that yet.
What to do, what to do?
Friday, October 14, 2005
More on the Myth
Okay, this one's not really about Shor and QKD, but it is about the authenticated channel for QKD.
I said the physicists underplayed the importance of the authenticated channel in operational practice, and was (rightly) chastened by Dave Bacon and Daniel Gottesman. In the interests of equal time, here's proof that not everyone in the computer networking community gets it:
A thread on an Internet Draft (= proposed RFC) on using QKD for key exchange for PPP. Here's a link to the Internet Draft itself (warning, that link will probably go stale in April 2006, and may or may not properly track revisions to the draft, depending on whether or not I picked up the right link). The email thread seems a little fragmented; part of the conversation appears to be taking place elsewhere.
I haven't actually read the draft (the "view document" link at the IETF site appears broken?!?), but from the email thread, it appears that Mr. Sfaxi either does not understand or cannot explain the use of the classical authenticated channel. Remember, gang: QKD requires access to an authenticated classical channel.
Stay tuned for more late-breaking news from the exciting world of IETF...
[If you've just come into this conversation, I recommend reading this posting, this posting, and especially quant-ph/0406147.]
Thursday, October 13, 2005
Quantum Time Machines: What? Why? and (maybe) How?
Tim Ralph is in Tokyo this week, and is giving a talk today. Here's the abstract:
Speaker: Prof. Tim C. Ralph, Centre for Quantum Computer Technology, University of Queensland, Australia

Title: Quantum Time Machines: What? Why? and (maybe) How?

Abstract: Whether time travel into the past is possible is an undecided physical question [1]. Recently it has been noted that certain models of time travel for quantum particles do not lead to the same difficult paradoxes that arise for classical particles [2]. Furthermore, the types of quantum evolutions predicted for these "quantum time machines" could give rise to a "super" quantum computer, able to solve problems thought to be intractable by any other means [3]. In this talk I will discuss time machines in general, how quantum mechanics avoids the paradoxes, and the unusual evolutions predicted. I will then argue that the requirements for realizing such machines are not as stringent as previously thought, and I will propose "horizon technology" experiments which could test these ideas [4].

[1] M.S. Morris, K.S. Thorne and U. Yurtsever, Phys. Rev. Lett. 61, 1446 (1988).
[2] D. Deutsch, Phys. Rev. D 44, 3197 (1991).
[3] D. Bacon, Phys. Rev. A 70, 032309 (2004).
[4] T.C. Ralph, quant-ph/0510038.
With an abstract like that, you'd have to beat me with a stick to keep me away from the talk. Now, I only know a little about relativity - what every Caltech freshman knows (special relativity) and what any fan of popular science writing knows (e.g., Kip Thorne's fabulous Black Holes and Time Warps). I know squat about the real math of general relativity, but I'm familiar with the basic implications of the curvature of space, black holes, wormholes, etc. So, should be a fun but challenging talk.
To prepare, I spent a little time this morning reading Tim's paper (quant-ph/0510038) and took a quick look at Dave Bacon's recent paper. Dave's paper is apparently a pretty major one. Building on work by Deutsch, Brun, and others, he shows that the existence of closed timelike curves (CTCs) would allow NP-complete problems to be solved on a quantum computer using only polynomial resources.
What the heck is a closed timelike curve? It's a path through a wormhole that comes back to where you started. Special relativity treats space and time as a four-dimensional space called spacetime, and establishes rules for paths in that space. You can never reach another point in spacetime that would require you to travel faster than light to get there. General relativity, which introduces the curvature of space, in its extreme appears to allow the creation of wormholes - places in space that, if you step through, take you to another distant point in space. Travelling through a wormhole can allow you to travel into your own past, coming back to where you started, and creating a "closed timelike curve". So Dave showed that travelling through a wormhole will allow you to solve NP-complete problems efficiently on a quantum computer. (We won't talk about grandfather paradoxes and the like, but apparently quantum mechanics helps there.)
Moving on, Tim's paper is lucid and remarkably easy to read for a non-specialist like me. He shows that you can take an entangled pair of quanta (an EPR pair), relativistically accelerate one and bring it back to the starting point, and like the pedagogical twins, one particle will now be "older" than the other. Using the standard quantum techniques of teleportation, he then creates a state that's something a little like a wormhole, essentially a loop in time. You can't use this to violate causality or learn about the future, but it does allow nonlinear evolution of the quantum system. Dave's system depends on the nonlinearity created by the wormhole; here Tim has done it without one. Fascinating.
Okay, time to pack up and head to the seminar and see whether I've correctly interpreted Tim, and what other features of this there are that I haven't thought of...
(From this point, it's just the notes I typed during the seminar, so apologies if they seem a little abrupt or semi-coherent.)
Morris, Thorne, Yurtsever, PRL 61
Where are all the time tourists? There's a plaque in Perth saying that, in the event transportation of life from the future becomes possible, travellers should please show up in Perth on March 31, 2005. No one did. (Reminds me of the MIT conference.) Requiring that a time machine receiver be built first eliminates the problem of the lack of visible travellers, but still doesn't eliminate the standard causality problems.
Tim actually likes Sagan's Contact. Wormholes were first described by Einstein and Rosen, so they are also called Einstein-Rosen bridges, but in their solution the wormhole collapses too quickly to be used. Kip Thorne showed that if you thread a wormhole with exotic matter, it might stay open long enough to let regular matter pass through. Thorne also showed that many of the possible paradoxes resolve themselves, but whether they all resolve comfortably is unknown.
You get a nice little paradox if you use classical bits and a CNOT gate. Start with a one on the target bit, run it through the CNOT gate, and take that output rho, shove it through your time machine into the past, then use that as the control line for the CNOT gate. There's no possible consistent result. (Think about the states where rho comes out of the time machine as a one or a zero.)
However, if you do this with qubits, there is a solution. rho = 1/2(|0><0| + |1><1|) works. Deutsch then generalized and showed that any unitary evolution U has a solution without creating classical paradoxes, provided that you keep your particle that goes through the wormhole in a box and neither measure nor prepare it during the time travel interval.
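That fixed point is easy to verify numerically. Here's my own toy numpy check (not from the talk): the chronology-respecting target qubit enters as |1>, the CNOT's target output is looped back in time to become the control input, and consistency demands a control state that maps to itself:

    # Deutsch-style consistency check for the CNOT paradox; my own toy code.
    import numpy as np

    CNOT = np.array([[1, 0, 0, 0],    # qubit 0 = control, qubit 1 = target
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=complex)
    one = np.array([[0, 0], [0, 1]], dtype=complex)   # target starts in |1><1|

    def loop(rho):
        """One trip around the time machine: the target's output state."""
        full = CNOT @ np.kron(rho, one) @ CNOT.conj().T
        full = full.reshape(2, 2, 2, 2)
        return np.einsum('ijik->jk', full)   # partial trace over the control

    for name, rho in [("|0><0|", np.diag([1, 0]).astype(complex)),
                      ("|1><1|", np.diag([0, 1]).astype(complex)),
                      ("I/2   ", np.eye(2, dtype=complex) / 2)]:
        print(name, "->", np.round(loop(rho).real, 2).tolist())
    # The classical inputs flip 0 <-> 1 forever (the paradox); only the
    # maximally mixed state I/2 maps to itself, a consistent solution.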
The wormhole introduces a nonlinearity. Non-relativistic ("classical") quantum mechanics, of course, is linear. So is there a way to use this added nonlinearity? Yes: in certain regions, it creates a faster growth of the trace distance between two states. This is where the power comes from that Bacon used to solve NP-complete problems.
Of course, we don't have a source of wormholes to experiment with, and don't even know how to build them. So what about a stochastic time machine? One that lights up one of four possible lights (labeled I, X, Z, and XZ) every time you shove a particle in to send it into the past. If you're familiar with quantum teleportation, you'll immediately recognize those as the operators you need to apply to retrieve the teleported state, once you've measured the qubit at the source.
During the time travel interval (between the past time where you receive the particle, and the future time where you sent it), what you have is a mixed state. Now there are no paradoxes, even if you observe the particle. And you can compute with it. But your real, true final answer isn't known until after the (local) time that you sent the particle into the time machine.
Teleportation can be considered a "stochastic wormhole". Do the same thing you would with a real wormhole - accelerate one of the particles at relativistic speeds for a while, then bring it back where you started - and you have a stochastic time machine.
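Since the four lights are just the standard teleportation corrections, that part can be checked on a laptop. A toy numpy run (my own illustration, not from the talk): each Bell outcome fires with probability 1/4, and applying the matching Pauli operator recovers the teleported state:

    # Teleportation with the four I, X, Z, XZ corrections; my own toy code.
    import numpy as np

    I2 = np.eye(2, dtype=complex)
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    Z = np.array([[1, 0], [0, -1]], dtype=complex)

    bell = {
        "I":  np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2),   # Phi+
        "Z":  np.array([1, 0, 0, -1], dtype=complex) / np.sqrt(2),  # Phi-
        "X":  np.array([0, 1, 1, 0], dtype=complex) / np.sqrt(2),   # Psi+
        "XZ": np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2),  # Psi-
    }
    corr = {"I": I2, "X": X, "Z": Z, "XZ": X @ Z}

    rng = np.random.default_rng(7)
    psi = rng.normal(size=2) + 1j * rng.normal(size=2)
    psi /= np.linalg.norm(psi)                 # the state to send "into the past"

    state = np.kron(psi, bell["I"])            # qubit 0: input; qubits 1,2: EPR pair
    for label, b in bell.items():
        proj = np.kron(np.outer(b, b.conj()), I2)    # Bell measurement on qubits 0,1
        branch = (proj @ state).reshape(4, 2)
        prob = np.linalg.norm(branch) ** 2           # 0.25 for every outcome
        bob = b.conj() @ branch                      # strip off the sender's register
        bob /= np.linalg.norm(bob)
        fid = abs(np.vdot(psi, corr[label] @ bob)) ** 2
        print(f"light {label:>2}: p = {prob:.2f}, fidelity after correction = {fid:.3f}")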
What does it all mean? It seems that quantum systems are better behaved with respect to time travel than classical machines. A stochastic quantum time machine still leads to interesting and potentially useful quantum evolutions whilst avoiding ad hoc assumptions. Teleportation using time displaced entanglement appears equivalent to a stochastic time machine. These experiments are hard, but not too hard.
Question: What does increasing the overlap do to security proofs of QKD and things of that nature? Answer: Don't know. Would be interesting to check...
Question: Increasing the overlap, doesn't that have implications for cloning and superluminal communication? Answer: You essentially "lose contact" with other qubits (you have to trace out over other qubits, because the operation is on the reduced density operator). So, no, you don't get superluminal communication. (See Deutsch.)
It startles and intrigues me no end that quantum mechanics, information theory, and general relativity converge in such interesting ways.
See this auction!
Friday, October 07, 2005
My First Perpetual-Motion Machine
A totally random anecdote...last night I recalled building my first perpetual motion machine. I don't have a clear idea of how old I was, but old enough to have my electronics science kit, including a small solar cell. Ten, perhaps?
My dad is big on O-gauge model trains, and they were set up in the basement. There were some lampposts with grain-of-wheat lightbulbs to go with them. The lampposts were intended to be battery-driven, but I had a better idea: I'd run them with my solar cell. I realized that they would need to be "primed"; I'd have to start with the battery hooked up, but then, assuming all went well, I could unplug the battery and the solar cell would take over. Light from the lampposts themselves would drive the solar cell, which would drive the lampposts. Voila! Lighting for the trains, without a battery!
Needless to say, every time I unplugged the battery, the lampposts went out. I was smart enough to figure out that the solar cell needed more light, so I tried adding reflectors that would send back a greater percentage of the light to the solar cell. No joy. I think (though this part of the recollection is fuzzy) that I wound up trying a less ambitious system, with just the lightbulb directly in front of the solar cell, which of course failed, too.
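In hindsight, the arithmetic was never close. A back-of-the-envelope sketch, with made-up but generous efficiencies:

    # Why the lamppost loop died the instant the battery came out; all the
    # efficiency numbers below are my own rough guesses, for illustration.
    capture = 0.5   # fraction of the bulbs' light the reflectors return to the cell
    cell = 0.10     # solar cell efficiency (kind to a 1970s toy)
    bulb = 0.05     # incandescent bulb: electricity in, visible light out

    gain = capture * cell * bulb   # power multiplier per round trip = 0.0025
    power = 1.0                    # watts at the moment the battery is unplugged
    for trip in range(1, 4):
        power *= gain
        print(f"after round trip {trip}: {power:.1e} W")
    # Three trips in, the light is fainter by a factor of about 10**8; no
    # reflector was ever going to push the loop gain past 1.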
And there this particular memory vignette fades to black. I have no particular recollection of realizing that what I was trying was theoretically impossible, only that the practical considerations were insurmountable. There were, in all probability, mechanical perpetual motion machines from about the same time frame. Something for me to try to dig out of that unreliable and fast-fading memory bank...
Tuesday, October 04, 2005
100 Greatest Americans?
Rod & Ted's Alternate List of Greatest Americans
After reading the Discovery Channel's list of one hundred candidates for the Greatest American, Ted and I spent way too much time on creating our own version. We wound up throwing out more than half of the original list.
There was significant discussion over whether athletes and entertainers belonged on the list at all. They were certainly FAR over-represented in the original list. Probably none would be serious candidates for the single greatest American -- but the list of true candidates for that honor is very short, anyway. We settled on keeping representatives from almost the whole range of human endeavor. Most of the athletes, artists, musicians, and film actors/directors we kept were truly transformative, in our opinion. Feel free to argue!
Fields that we felt were egregiously underrepresented in the original list were writers, artists, jurists, scientists and engineers. In our current list, it's possible that business types are underrepresented. We wanted an educator, but couldn't come up with one.
In a few cases, we invoked executive privilege and listed two or three people as a single entry, when their names are rarely mentioned apart (or, in the case of Apollo 11, rarely should be).
In a *very* few cases, the person in question is a putz. But, we felt their accomplishments were difficult to ignore, and in at least two cases, their putziness came into evidence later in life, after their accomplishments had already established their importance.
For your reading and arguing pleasure.
Keep (44):
- Ali, Muhammad (a)
- Angelou, Maya (w)
- Anthony, Susan B. (l)
- Bell, Alexander Graham (i)
- Carnegie, Andrew (b)
- Carson, Johnny (f)
- Carver, George Washington (i)
- Chavez, Cesar (l)
- Disney, Walt (f,v,b)
- Douglass, Frederick (l)
- Edison, Thomas Alva (i)
- Einstein, Albert (i)
- Ford, Henry (b)
- Franklin, Benjamin (i,t)
- Gates, Bill (b)
- Glenn, John (e,t)
- Graham, Billy (r)
- Hamilton, Alexander (t)
- Jefferson, Thomas (p,i)
- Keller, Helen (l)
- Kennedy, John F. (p)
- King Jr., Dr. Martin Luther (l)
- Lincoln, Abraham (p)
- Lindbergh, Charles (e)
- Malcolm X (l)
- Murphy, Audie (s)
- Owens, Jesse (a)
- Patton, George (s)
- Presley, Elvis (m)
- Roebling, John A. ?
- Robinson, Jackie (a)
- Roosevelt, Franklin D (p)
- Roosevelt, Theodore (p)
- Ruth, Babe (a)
- Salk, Jonas (i)
- Smith, Joseph (r)
- Tesla, Nikola (i)
- Tubman, Harriet (l)
- Twain, Mark (w)
- Washington, George (s,p)
- Wayne, John (f)
- Wright, Orville & Wilbur (i)
- Yeager, Chuck (e)
New (51):
- Ansel Adams (v)
- Apollo 11 crew (Neil Armstrong, Buzz Aldrin, & Michael Collins) (e)
- Louis Armstrong (m)
- Daniel Boone (e)
- Ray Bradbury (w)
- William Jennings Bryan (j)
- Alexander Calder (v)
- Johnny Cash (m)
- Vint Cerf (i)
- Edward Drinker Cope & Othniel Charles Marsh (i)
- Aaron Copland (m)
- Davy Crockett (e)
- Walter Cronkite (f)
- Glenn Curtiss (i,b)
- Virginia Dare (o)
- Clarence Darrow (j)
- Charles & Ray Eames (v)
- Richard Feynman (i)
- Bobby Fischer (o) *
- F. Scott Fitzgerald (w)
- Robert Fulton (i)
- Theodore Geisel (w)
- William Randolph Hearst (b)
- Ernest Hemingway (w)
- Sam Houston (e,s,t)
- Edwin Hubble (i)
- Andrew Jackson (s,p)
- Francis Scott Key (w)
- Meriwether Lewis & William Clark (e)
- Douglas MacArthur (s)
- George C. Marshall (t)
- John Marshall (j)
- Thurgood Marshall (j)
- Herman Melville (w)
- Gordon Moore (i,b)
- John Muir (e)
- James Naismith (a)
- John von Neumann (i)
- Sandra Day O'Connor (j)
- Robert Oppenheimer (i)
- Linus Pauling (i)
- Edgar Allan Poe (w)
- Sacajawea (e)
- Charles Schulz (v)
- William Shockley (i) *
- Siegel & Schuster (v)
- Henry David Thoreau (w)
- Andy Warhol (v)
- James Watson (i)
- Orson Welles (f)
- Walt Whitman (w)
- Frank Lloyd Wright (v)
* = putz, but...
Our breakdown for the whole 100, by profession (totals exceed 100 because our list is actually slightly over 100, and some have two professions):
p: presidents: 7
s: soldiers: 6
j: jurists/lawyers: 5
t: other politicians/statesmen: 5
l: labor/rights/activists: 7
e: adventurers/explorers/astronauts/pioneers: 13
i: scientists/inventors: 22
v: visual arts/design/architecture: 9
w: writers: 11
f: film/tv: 4
m: musicians: 4
a: athletes: 5
b: business: 5
r: religion: 2
o: other: 2
Eliminate (56):
- Armstrong, Lance
- Armstrong, Neil
- Ball, Lucille
- Bush, Barbara
- Bush, George H. W.
- Bush, George W.
- Bush, Laura
- Carter, Jimmy
- Charles, Ray
- Clinton, Bill
- Clinton, Hillary
- Cosby, Bill
- Cruise, Tom
- DeGeneres, Ellen
- Earhart, Amelia
- Eastwood, Clint
- Edwards, John
- Eisenhower, Dwight (s,p)
- Favre, Brett
- Gibson, Mel
- Giuliani, Rudolph
- Hanks, Tom
- Hefner, Hugh
- Hepburn, Katharine
- Hope, Bob
- Hughes, Howard
- Jackson, Michael
- Jobs, Steve
- Johnson, Lyndon B.
- Jordan, Michael
- Kennedy, Robert F.
- Kennedy Onassis, Jacqueline
- Limbaugh, Rush
- Lucas, George
- Madonna
- McGraw, Dr. Phil
- Monroe, Marilyn
- Moore, Michael
- Nixon, Richard
- Obama, Barack
- Parks, Rosa
- Powell, Colin
- Reagan, Ronald
- Reeve, Christopher
- Rice, Condoleezza
- Roosevelt, Eleanor
- Sagan, Carl
- Schwarzenegger, Arnold
- Sinatra, Frank
- Spielberg, Steven
- Stewart, Jimmy
- Stewart, Martha
- Tillman, Pat
- Truman, Harry
- Trump, Donald
- Walton, Sam
- Winfrey, Oprah
- Woods, Tiger
Honorable Mentions:
- Isaac Asimov
- Carl Barks
- Vannevar Bush
- John Coltrane
- Charles Curtis
- Miles Davis
- Philip K. Dick
- Elizabeth Dole
- Duke Ellington (m)
- Philo T. Farnsworth
- William Faulkner
- Geraldine Ferraro
- Hal Foster
- Hugo Gernsback
- Samuel Gompers
- U.S. Grant
- Devil Anse Hatfield
- Nathaniel Hawthorne
- Robert Heinlein
- Frank Herbert
- Billie Holiday
- Harry Houdini
- Mother Jones
- Bob Kahn
- Bob Kane
- Chief Logan
- Mercury Seven, the rest
- Nancy Landon Kassebaum
- Walt Kelly
- Wilma Mankiller
- Winsor McCay
- Charlie Parker
- Frances Perkins
- Jon Postel
- James Randi
- Norman Rockwell
- Nellie Tayloe Ross
- Chief Seattle
- Claude Shannon
- Upton Sinclair
- Dr. Spock
- Jim Thorpe
- Gore Vidal
- Booker T. Washington
- Laura Ingalls Wilder
- Woodrow Wilson
- Woodward & Bernstein
- Daniel Carter Beard & William D. Boyce
Monday, October 03, 2005
Stop the Myth: QKD doesn't fix what Shor broke
[Update: See More on the Myth.]
There is a lot of confusion about the relationship between two of the most famous results in quantum information theory: Shor's algorithm for factoring large numbers on a quantum computer and quantum key distribution (QKD) (developed originally by Charles Bennett and Gilles Brassard in 1984). Shor's algorithm, of course, has implications for public-key cryptography.
It's important to note that a) QKD does NOT solve what Shor's factoring algorithm broke, and b) key exchange/distribution is not the biggest security problem we have on the net (it might not even make the top ten). The primary use of public-key cryptosystems is distributed authentication. The primary use of QKD will be key generation/exchange. They serve very different roles in the Internet. A secure Internet communication consists of several distinct parts - authentication of the parties involved, generation of a secure session key, and then the data encryption itself.
Your Internet Explorer, Mozilla, or Firefox comes with Verisign's public key built in. This allows you to check that yes, indeed, Verisign signed the key that says it came from Amazon.com, and therefore you can believe that you are truly connected to Amazon. This is what we call authentication.
After this authentication step, you must create a shared, secret key to continue the communication. That key exchange is often done via Diffie-Hellman key exchange, which allows parts of the communication to be done in the open (unencrypted). D-H by itself is vulnerable to a man-in-the-middle attack, so it is used with some form of authentication to create a shared, authenticated secret key for your session.
The difficulty of cracking a Diffie-Hellman exchange is, I believe, founded on the difficulty of the discrete logarithm problem, and hence is vulnerable to Shor. HOWEVER, D-H is not the only classical method of doing key exchange. If you already have a shared secret, it is easy to negotiate a key without D-H. I used to work for a startup that did IPSEC gateways (though I am FAR FAR from an expert on this), and I believe most of our customers did their authentication via pre-shared secret key. Though the IKE (Internet Key Exchange) built on that secret key authentication does use D-H, that can be fixed (by modifying the RFCs); when you have a pre-shared secret, you can easily use it to encrypt a new key chosen by one of the parties.
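To make that concrete, here's roughly what the fix looks like, sketched with nothing but the Python standard library. This is an illustration of the idea, not the actual IKE exchange; the labels and message layout are mine:

    # With a pre-shared secret, one side picks a fresh session key and sends it
    # masked and authenticated - no Diffie-Hellman, so no Shor exposure.
    import hmac, hashlib, secrets

    def derive(psk: bytes, label: bytes, nonce: bytes) -> bytes:
        """Derive an independent subkey from the pre-shared secret."""
        return hmac.new(psk, label + nonce, hashlib.sha256).digest()

    psk = secrets.token_bytes(32)      # shared out-of-band ahead of time
    nonce = secrets.token_bytes(16)    # fresh per exchange, sent in the clear

    # Initiator: choose a brand-new session key, mask it, and MAC the message.
    session_key = secrets.token_bytes(32)
    mask = derive(psk, b"encrypt", nonce)
    wrapped = bytes(a ^ b for a, b in zip(session_key, mask))
    tag = hmac.new(derive(psk, b"auth", nonce), nonce + wrapped, hashlib.sha256).digest()

    # Responder: verify the MAC, then unmask the same session key.
    expected = hmac.new(derive(psk, b"auth", nonce), nonce + wrapped, hashlib.sha256).digest()
    assert hmac.compare_digest(tag, expected)
    recovered = bytes(a ^ b for a, b in zip(wrapped, derive(psk, b"encrypt", nonce)))
    assert recovered == session_key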
So Shor breaks public-key (PK) crypto. What are the implications? Without PK, our key management headaches get worse, but there is (almost?) nothing we can do with public key we can't do with private key. The scalability of authentication servers is problematic and the number of keys you keep around on your systems grows, but in practice probably not nearly as much as people fear, since most people really only connect securely to a few places, anyway.
Okay, so can QKD take over the role filled by PK crypto? After all, the physicists talk about it being impossible to break because of the physics. Nope, and for a simple reason: QKD still requires a classically-authenticated channel; that is, it still requires the functionality that Shor broke! It doesn't fix Shor. But, if you have authentication you can trust today, regardless of whether it is PK or not, you're set; the authentication information is not used in the generation of the key used to encrypt the data.
I sat in on a talk on QKD by Todd Brun at ISI's div7, in front of thirty or forty of the planet's best networking researchers. They immediately started jumping up and down and stamping their feet about man-in-the-middle attacks and the need for the authenticated channel. Todd remarked something to the effect that the physicists, if they are aware of the problem at all, are much less cognizant of its operational implications. Paterson et al. wrote a nice paper on the implications of QKD, where much of what I've written here is discussed, and probably more coherently.
QKD's place is for the truly, truly paranoid: people who believe that their traffic is being snooped today, and that classical, symmetric crypto (3DES, AES) will fall to either mathematical advances or some further quantum computing advance sometime in the distant future, and don't want their traffic from TODAY read by somebody half a century from now. Authentication that is good enough TODAY, plus QKD, is good enough to guarantee that your data is safe for all time (modulo breakins/breakdowns at the end points).
In that case, your only operational choice is to use the QKD-generated key as a one-time pad, and discard; if you're really paranoid enough to want the QKD, don't use it to just generate keys for use with a symmetric crypto protocol. That means you need very high QKD bit-generation rates, and most are still in the kilobits/second. Some experiments have been done in the low megabits/sec., but that's pre-filtering, I believe, which costs you at least one order of magnitude in performance. I was at a talk by Tomita-san, one of NEC's QKD experts, just a couple of days ago, and he was talking about an experimental setup over 16 km of real-world fiber on telephone poles that did QKD at an average rate of 13 kilobits/second. The most advanced engineering demonstration is the BBN/Harvard/Boston U. network, see some of Chip Elliott's papers.
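The one-time-pad discipline itself is trivial; the key rate is the whole problem. A toy sketch, with random bytes standing in for the QKD output:

    # One-time pad: key as long as the traffic, used once, then thrown away.
    import secrets

    message = b"meet at dawn"
    qkd_key = secrets.token_bytes(len(message))   # stand-in for QKD-generated bits

    ciphertext = bytes(m ^ k for m, k in zip(message, qkd_key))
    plaintext = bytes(c ^ k for c, k in zip(ciphertext, qkd_key))
    assert plaintext == message
    qkd_key = None   # discard: reuse would break the information-theoretic guarantee

The key has to be as long as the traffic and never reused, which is why the kilobit-per-second rates above are the whole ballgame.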
I arrived at the party a little late to get in on the recent thread at Dave Bacon's Quantum Pontiff blog, but I did throw in my two cents anyway, see the comments here. There was also some discussion on Dave Farber's IP list, see here and search for quantum or Armstrong.