No, not the Caltech mascot...
This L.A. Times story says paleontologists have found a 164 million-year-old mammal. It was aquatic and furry, had a beaver-like tail, and weighed 500-700 grams, which is a lot heavier than previously known Jurassic mammals. It's not directly related to the beaver, though it has some similarities, and it would push back the history of aquatic mammals by a hundred million years or so.
Saturday, February 25, 2006
Distributed Arithmetic on a Quantum Multicomputer
My paper "Distributed Arithmetic on a Quantum Multicomputer" was accepted for ACM's International Symposium on Computer Architecture (ISCA). This is ACM's most prestigious and competitive architecture conference, and this year there were 31 acceptances out of 229 submissions.
See you in Boston in June!
Friday, February 24, 2006
Quantum Design Tools
Krysta Svore, Al Aho, Andrew Cross, Ike Chuang and Igor Markov published a paper titled A Layered Software Architecture for Quantum Computing Design Tools in the January issue of IEEE Computer.
The paper is a somewhat general summary of work this group has been doing for a couple of years. It covers their four-phase software architecture, compiling from high-level languages down to an optimized program for a particular circuit layout.
This is, in general, good news; we have definitely reached the point where we need serious, modular tools that can be adapted by various research groups to meet their particular experimental needs, without starting from scratch. And this is just the group of people to do it, and the article suggests that the tools will be open-source.
I have some tools of my own; I will investigate integrating what I've got with what they have accomplished...
ATP: Astronomical Toilet Paper
Today's Daily Yomiuri has an article about astronomical toilet paper, created by a group of astronomy graduate students, planetarium employees, and others. It features the life of a star, from proto-star through main sequence to red giant. The paper itself is only in Japanese, so far, but there is an English web page linked from the above.
Great quote from one of the creators: "In your toilet we'd like you to feel the vastness of the Universe and realize that men and the Earth exist as part of the universe."
ACM's Globalization Report
Yesterday we were talking about globalization, and by coincidence, today ACM has released its report on globalization. While the focus is on software, if you believe that globalization doesn't affect whatever your personal specialty is, you're mistaken.
Thursday, February 23, 2006
San Francisco's Japantown in Danger?
I just received (or, just read) mail saying that a big chunk of Japantown is for sale. Japantown has been through many ups and downs in its 100-year history, but it is currently only a small area, with only a few shops and businesses outside of the mall itself. It would be a real shame for Japantown to go away, but this is a long discussion involving the personal choices of many Nisei (second-generation Japanese Americans), history as far back as the forced relocations during WWII, the economic arc of various San Francisco neighborhoods, the one-time ghetto-ization of many American minorities, the current trend toward assimilation in much of the Japanese-American community, the current small number of Japanese immigrants (and short-term expats sent by the home office)...
There appear to be only three Japantowns left in the U.S.: San Francisco, San Jose, and L.A.'s Little Tokyo, plus undoubtedly a few other less-formal neighborhoods, such as what we called "Pico-Tokyo" along Sawtelle near Pico in west L.A.
GSM in Guam
We have experimental verification. GSM works on Guam, and roaming from NTT DoCoMo works properly. No GPRS (packet service), just basic voice service. And the GSM only works near the major tourist/population centers, not the remote south/east parts of the island.
The snorkeling, btw, was fantastic. We saw eels, triggerfish, wrasses, tangs, unicornfish, a few puffer-family fish, and some nice coral. Jealous yet :-)?
We met some guys from Wisconsin who were on a two-week diving trip to Truk and Palau. Man, if I did that and called home to Wisconsin where my wife was dealing with 30cm of snow and -19F (-28C) temperatures, when I got home the locks would be changed.
NEC Shutting Irish Plant
Forbes and others are reporting that NEC Electronics has announced it will close its plant in Ireland in September, laying off 350 workers. The plant makes micro-controllers for automotive electronics. The work will be transferred to China, Malaysia, and Singapore, according to Kyodo. Some 150 nm lithography equipment will also be moved from Sagamihara (Japan) to a plant in the U.S.
For the last decade or more, Ireland has been considered one of the EU's most attractive places to do high-tech manufacturing, due to a combination of wages, education, local infrastructure, and more. Now jobs are starting to leave because of high salaries? Tom Friedman would say it's globalization in action, and the "long-horn cattle" such as NEC move capital around more slowly than the "short-horn cattle" (day traders and other investors), but they do move it around in response to market forces. Did the strong euro hurt Ireland?
Saturday, February 18, 2006
Future of Classical Computing HW: Magnetic Quantum Cellular Automata
Geek Press points to a Wired article about some research on magnetic quantum dot cellular automata. This is based on a paper in Science by Imre, Porod and others from Notre Dame, and an associated perspective by Cowburn.
The research is fantastic. But I dislike Wired's characterization of it. Let's look at the science first. What they have done is use quantum effects to build classical logic; they are not using superposition or entanglement to run "quantum algorithms" like Shor's factoring algorithm. They have created nanometer-scale magnets which can be arranged so that they form a quantum cellular automaton (QCA). The magnets are a nickel/iron alloy, patterned using standard lithography on a silicon substrate. Set up properly, this QCA can perform a three-input majority gate: if two or more inputs are one, the output is zero; otherwise, it's one. This is a useful primitive, both directly (e.g., for calculating carry chains in adders) and because it trivially transforms to NAND or NOR. Similar work was done by Cowburn et al. a few years ago; the difference is that they used charge, whereas this work uses spin (magnetism). The new work has two major advantages: it can run at room temperature, and it can be non-volatile. They estimate that, at a 100 MHz switching rate, 10^10 gates would dissipate 100 milliwatts.
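The NAND/NOR reduction is easy to see in code. A minimal sketch of the gate as described above (output zero when two or more inputs are one, i.e. an inverting majority): tie one input low and you get NAND, tie it high and you get NOR.

```python
def inv_majority(a, b, c):
    """Inverting majority gate: output is 0 when two or more inputs are 1."""
    return 0 if a + b + c >= 2 else 1

def nand(a, b):
    # Tie one input low: NOT(majority(a, b, 0)) == NOT(a AND b)
    return inv_majority(a, b, 0)

def nor(a, b):
    # Tie one input high: NOT(majority(a, b, 1)) == NOT(a OR b)
    return inv_majority(a, b, 1)

# Truth tables confirm the reductions
for a in (0, 1):
    for b in (0, 1):
        assert nand(a, b) == (0 if a and b else 1)
        assert nor(a, b) == (0 if a or b else 1)
```

Since NAND (or NOR) alone is universal, a working majority gate is in principle all you need to build arbitrary classical logic.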
The Wired article described this as important for its nonvolatility (true), density (false), speed (false) and lack of wires (false). The technology has some things in common with FeRAM, and a good, low-power, dense, room-temperature, random-access, non-volatile memory will be a huge boon. But fast it is not, in its current form. 100MHz is not an especially quick switching speed, as Cowburn noted. As to density, as long as the structures are lithographically defined, they are not inherently dramatically better than normal chips (bits on magnetic tape or disk, in contrast, are much smaller than individual transistors).
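For what it's worth, the quoted dissipation figure implies an energy per switching event on the order of 10^-19 joules; a quick back-of-the-envelope check:

```python
power_w = 0.1           # 100 milliwatts total dissipation (quoted estimate)
gates = 1e10            # number of gates (quoted estimate)
switch_rate_hz = 1e8    # 100 MHz switching rate

# Energy per gate per switch = total power / total switching events per second
energy_per_switch_j = power_w / (gates * switch_rate_hz)
print(energy_per_switch_j)  # → 1e-19 joules per gate switch
```

That's a very low switching energy; the catch, as noted above, is that the switching is slow.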
More importantly, the lack of wires that Wired seems to like actually creates a problem: how do you get information from one part of a chip to another? Well, you have to create a chain of automata that form a switching channel, clocking data from place to place down the chain. This is going to be very slow, compared to standard electrical signal propagation in a wire. It may also be wider, I'm not sure, and space for wires is one of the biggest problems we have in chip design today.
But there are interesting possibilities in combining this with normal, charge-based logic. Something similar to FeRAM definitely has possibilities. One thing suggested is that you can put some processing with the data. This idea is not new; the Berkeley iRAM project has been pushing it for a decade or so. The success or failure of MQCA, in my opinion, does not hinge on the success or failure of iRAM, but might work well with it.
In summary: 1) this is fantastic work, and might change the way we build chips and ultimately systems; 2) Wired doesn't quite understand why it's fantastic; 3) I need to think about this more before I understand how to best take advantage of it; and 4) there are probably others with better ideas than me...
Experimental Verification: GSM in Guam?
According to GSM World, NTT DoCoMo has a roaming agreement with Guam Wireless for 1900MHz GSM service. This should allow my N900iG to work.
I'd never pass up the opportunity for experimental verification of something this important. The sacrifices I make in the name of science :-). I'll report back in a few days.
Keene's Chronicles
The Daily Yomiuri is publishing a series of short autobiographical essays by Donald Keene. Keene is a famous professor of Japanese studies at Columbia, where they now have the Donald Keene Center.
Keene first came to Japan right after the war, after having served as a translator. He lived here off and on, I think, while holding down duties at Columbia, for the next half century. He has written or translated dozens of books on Japan.
I have no idea how long DY will keep those essays up; they're bad about archiving things in a findable place. The online version doesn't include the nice illustrations. Perhaps this will be published as a book later.
In the course of the series, so far, he is a student at Columbia. I think the series will run all year.
Friday, February 17, 2006
Electric Super-Cars: 290kph, 370kph!
Various news outlets are reporting that Hybrid Technologies will unveil an electric car capable of speeds of 290 km/hour. A two-seater, carbon-fiber-bodied, lithium-battery-powered rocket that runs as fast as a shinkansen.
A professor at Keio University (my university) and his team have created Eliica, an eight-wheeled electric limo capable of 370 kilometers/hour. Now that's fast!
Tuesday, February 14, 2006
"Enjoy Your Failures": Akira Furusawa
Nine-twenty or so on Valentine's evening, and Mayumi is putting the girls to bed. I flip on the TV, looking for the Torino Olympics, and whose voice do I hear? Akira Furusawa's! In a minute or so, it becomes clear that I have stumbled into an hour-long show about Akira and his lab on NHK, the national network. It's part of a series on inspirational professionals.
Prof. Furusawa, of course, is one of the planet's leading experts on quantum optics and one of the experimentalists who first performed quantum teleportation. He's now a professor at Todai, the University of Tokyo.
What follows is the raw notes I typed during the last forty minutes of the show. I also have that part on tape. I'd like to have the whole thing.
Talking about enjoying failure. Science is like a sport. You challenge yourself, measure yourself against the best.
Saying "I'm busy," really, more than in body I'm busy in mind. I try to eat dinner with my family.
He even brought a teleportation setup into NHK's (stark black and white) studio. The announcer is amazed at him routing a laser through half a dozen mirrors and lenses.
How much does this cost? Well, that piece is about three thousand dollars... total setup, could be millions? Yeah, runs into that range.
He said when he's working on a problem, he likes to go to sleep early. While he's asleep, (during REM sleep), he gets good ideas and wakes up with the solution.
Akira likes to ski (video of him on the slopes -- they went to a lot of effort to make this show)
He was a regular salaryman until 33. Then, seeing Nomo sign with the Dodgers, he decided he wanted a shot at the big leagues, too. He goes to Caltech, joins Jeff Kimble's lab, and immediately runs into a wall: his English isn't good enough. One frustrating day, he goes to Dodger Stadium, and sees Nomo hit his first home run in the majors. He comes back, thinking of sports, and takes up tennis with Kimble. Kimble says it helped their relationship.
"Enjoy your failures"
Talking about failures, didn't you ever fail? Well, really only walls you put in front of yourself. I did fail the driver's test three times.
How can you believe you're going to succeed when 99% of people fail? Everybody has the power to hit home runs, just not everybody believes in themselves.
Heck, they even went to his lab's bonenkai (year-end party).
Takahashi is experimenting on doing multiple teleportations at the same time (in a chain?) and is stuck. My students are actually doing outstanding work. Sometimes they don't think so, but compared to what we were doing eight years ago, this is great.
Home for dinner, nice looking family...
Another day, back to the lab. It's a month to a major international conference, and they're worried that they might not make it. Another try. Got it! Takahashi's face lights up. Akira at Narita Airport, "I'm looking forward to the conference, I think our results will surprise people."
"Being a professional means being able to enjoy what you do."
Ends with uplifting music.
45,000 Photographs of the California Coastline
In the process of avoiding work, I decided to drop in on www.californiacoastline.org, a project run by my friends Ken and Gabrielle Adelman. They have been taking photos of the California coastline for several years now, from a low-flying helicopter, documenting changes on the coast, both natural and man-made. They are nearing their 30,000th photo, and have added several photo databases extending back to 1972. They're most famous for having been sued by Barbra Streisand (they won), but the point is how human activity affects the coast, not who the people are and how fancy the houses are. Not that I contributed anything to the project, but I got to tag along on one short flight. Interesting work.
I would love to see a complementary set of ground-level photos of this, or any other, area (say, the town of Half Moon Bay), that resulted in a 3-D dataset showing how the town has evolved over the last century.
Monday, February 13, 2006
Farber Challenges Young Japanese Researchers
Dave Farber has just issued a challenge to the young Internet researchers in Japan, as part of his participation in the Mitou project.
The money quote: "There is a lot of pressure in the commercial world to turn the Internet into a television set... But we could instead produce a major change in the way people work, the way people live and the way people interact. Where we end up is still very much unknown..."
Friday, February 10, 2006
Lego Difference Engine
Andrew Carol has built a Lego version of Babbage's Difference Engine. Found via Geek Press.
Of course, aficionados of such things know that Danny Hillis and friends created a tic-tac-toe-playing computer from Tinkertoys when Danny was an undergrad at MIT.
Imperial Intrigue
Congratulations are due to Japan's Princess Kiko, who, it was announced on Wednesday, is pregnant with her third child.
I'm sure she's happy, but must be dreading the political firestorm approaching.
What firestorm? Oh, uh, the imperial succession. You see, as the press here likes to phrase it, "No male heir to the throne has been born in forty years." In Japan, under current law, only men are allowed to sit on the throne. The current emperor has two sons, Crown Prince Naruhito (called "Kotaishi-sama" here; you never hear his name in Japanese) and Prince Akishino, who are 45 and 40, respectively. Naruhito and his wife, Princess Masako (a Harvard-educated commoner) have one daughter, Princess Aiko, who is four years old. Akishino and his wife, Princess Kiko, have two daughters who are both older than Aiko.
Under current law, the succession would go Naruhito, Akishino, then the current emperor's younger brother. After that, nothing.
So, since Aiko was born, there has been discussion of revising the imperial succession law to allow her to sit on the throne. The debate had been gathering momentum, with several possible proposals on the table. There are two issues -- the immediate successor to the throne, and the succession from that emperor/empress. For the immediate succession, there are two main options:
[1] Succession to oldest child, regardless of gender.
[2] Succession to oldest boy, if there are no boys, then succession to oldest girl.
For the following succession, there are two options:
[A] Succession to emperor/empress's children, regardless of gender of emperor/empress.
[B] Upon the death of an empress, succession reverts back to someone with an emperor on his/her father's side.
Under the 1/A combo, Aiko gets to be empress, and her children will follow her. Under the 1/B combo, Aiko sits first, then if she dies before her girl cousins, they will sit in order of age. When they are all gone, we're back to the current dilemma; the throne would potentially fall to some distant cousin (this would involve expanding the royal family to include more cousins).
Confused yet? Now it gets complicated. (Try drawing out some family trees and numbering them for different scenarios, just for fun.)
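The two immediate-succession options are simple enough to sketch in code. This is a toy model with made-up names, not the actual legal drafting:

```python
def succession_order(children, option):
    """Order an emperor's children for the throne (a toy model).

    children: list of (name, gender) tuples, oldest first.
    Option 1: oldest child first, regardless of gender.
    Option 2: boys (oldest first), then girls (oldest first).
    """
    if option == 1:
        return [name for name, gender in children]
    boys = [name for name, gender in children if gender == "M"]
    girls = [name for name, gender in children if gender == "F"]
    return boys + girls

# A hypothetical family, oldest first: two girls, then a boy
family = [("Hanako", "F"), ("Yuki", "F"), ("Taro", "M")]
print(succession_order(family, 1))  # ['Hanako', 'Yuki', 'Taro']
print(succession_order(family, 2))  # ['Taro', 'Hanako', 'Yuki']
```

The A/B options are where it gets hairy, because they depend on deaths and on whose father was an emperor, so the order can't be computed from a single generation's birth list.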
What if Kiko's child is a boy? Under current law, he would be third in line for the throne, after his uncle and father. Under the 1/B combo, he might never sit on the throne, but his children would sit in preference to Aiko's or even his sisters'.
Now, maybe to you, especially if you're, say, English, where things have been done another way for centuries, this might seem bizarre and byzantine. But not here.
Prime Minister Koizumi, who views himself as a reformer but often kowtows to the right wing, had supported a bill picking, I think, option 1/A. (There was a government-commissioned panel to investigate options that concluded a few months ago; I don't remember the details.) After Kiko-sama announced her pregnancy (or, more correctly, the Imperial Household Agency that controls her every movement announced it), Koizumi initially said he would press forward with the bill. But yesterday he backed off from that stance, saying the issue needs to be considered carefully; this is probably a prelude to shelving it at least until the gender of the baby is known.
There's more (including various interpretations of Japan's history with empresses, and a proposal by a distant cousin to revive the concubine system), but I'm out of steam. Google News will help you out if you want more.
Toshiba Chip Plant Investment
Toshiba said it will raise its semiconductor capital expenditure to 289 billion yen ($2.4B at 119 yen/dollar) for the fiscal year ending in March. That's an increase of 63B yen ($529M) over a previous announcement, and the bulk of the increase is to go to NAND flash memory production capacity.
Future of Classical Computing HW: FeRAM
Toshiba has announced a 64-megabit FeRAM. It reads and writes at 200MB/sec in burst mode, and incorporates ECC. I'm unclear on when, or if, it will be generally available, and what the price will be relative to flash, SRAM, or DRAM.
FeRAM, as I understand it, replaces the dielectric layer in DRAM with a ferroelectric film, creating capacitors that retain a particular electric polarization when powered off, making the memory non-volatile. The similarity with DRAM means it should be able to reach similar densities and price points, and the performance is much more like DRAM than flash, as well.
Back in the old days, there were stories of mainframes with magnetic core memory being powered off, disassembled, shipped to a new site, reassembled, and powered on, and returning to execution right where they left off. With FeRAM, we can get similar behavior. FeRAM would also allow the write cache memory in a host operating system or a RAID controller to be stable, without a UPS or battery backup. Given the rate those fail at, that would be a huge benefit. Peter Chen's Rio file system caching work is one way to go about managing such memory.
A short article comparing MRAM and FeRAM can be found here.
Saturday, February 04, 2006
Tapping the Greek PM's Phone
The government of Greece is quite mad today, the Independent and others report. It seems that someone hacked into Vodafone Greece's network and tapped the cell phone calls of about a hundred people, including the Prime Minister, several cabinet ministers, and even a U.S. embassy employee, for about a year.
Of course, cell phones are made to be tapped. It's part of the design, especially in the network. The air interface is encrypted so that random eavesdroppers with a radio can't listen in, but when you run the back end network, it's no problem, in theory. Cell phone operators spend millions of dollars complying with government regulations that, in many (most?) countries require the operators to be able to tap phone calls when the government requests it. (In the U.S., it was widely understood, until recently, that the government had to have a warrant to conduct such a tap.)
The news here is that a hacker managed to get control of the network to do this, rather than the government requesting it. Worse, Vodafone simply killed the taps when they found them, without requesting help from the authorities in tracking down the perpetrators. The various media accounts conflict on when Vodafone actually informed the government that the phones had been tapped, but it may have been quite recently, and Vodafone found and killed the taps last March.
Designing the networks to make the tapping doable is actually a lot of work; you'd be amazed at the contortions the system has to go through to support this. It applies to both the circuit-switched and packet-switched sides of the network. This complexity is part of what keeps a cell phone (GPRS/W-CDMA) packet network from being as simple as an ordinary Internet ISP. The equipment is more specialized, slower, more complex, and has fewer customers, all of which contribute to making the equipment exotic and expensive.
None of the articles I've seen explicitly detail the technology used, but since the infrastructure is already there, my assumption is that what was hacked was control of the existing intercept gateways, so that rules or filters could be put in place. There's no need for particularly complex software, unless some was put in place to hide the taps from regular audits. The hacking itself could have been as simple as acquiring a carelessly controlled password, then running a few command-interpreter commands once inside the system. It takes a lot of knowledge, but I'll bet the changes to the machines were ultimately very small.
The articles talk about cell phone calls being tapped, but they make no mention of data (SMS/email/MMS/web browsing) being tapped.
[Update: Bruce Schneier points to an article that claims it was done by tapping into the conference calling system and making each of the phone calls into a surreptitious conference call. He also says it's Ericsson equipment.]
Friday, February 03, 2006
Future of Classical Computing Hardware: Plasmonics
A recent issue of Science had several interesting papers possibly relevant to the future of classical computing hardware. I'll try to review them one at a time over the next few days.
Ozbay wrote a review of plasmonics. Surface plasmons (SPs) are electromagnetic waves that are confined to the region near a metal/dielectric interface, theoretically much smaller than the (vacuum) wavelength of the light. You can't quite build a standard wire or even waveguide for SPs; proposed structures include an arrangement of nanoscale gold dots on the surface, so that the total layout creates a waveguide or mirror. Ozbay says that work on interfacing external ("normal") optics to plasmonics is proceeding well. However, plasmonics is being touted as a possible answer to the difficulty of intra-chip interconnect bandwidth and real estate, and that field still needs work (signal losses are apparently a big problem). In Ozbay's words,
[W]hen a lot of data need to travel from one section of a chip to another remote section of the chip, electronic information could be converted to plasmonic information, sent along a plasmonic wire, and converted back to electronic information at the destination. Unfortunately, the current performance of plasmonic waveguides is insufficient for this kind of application, and there is an urgent need for more work in this area. If plasmonic components can be successfully implemented as digital highways into electronic circuits, this will be one of the "killer applications" of plasmonics.
But what I don't understand is how this will solve our bandwidth problem. Ozbay mentions that a fiber optic interconnect can carry >1000 times as much data as an electronic interconnect, but I'm not following either the logic behind that or why it translates directly to plasmonic waveguides on a chip. The signalling necessary to convert between plasmonic waves and electronic state is inherently limited in speed. Is it that a plasmonic wire supports faster switching because longer electronic wires have increased capacitance and slower switching times? It's not propagation time, which won't be much different. The structures are not smaller; if anything, they may be larger, and would seem to allow fewer layers of interconnect, so I don't think it's spatially more efficient. Frequency division is one possibility, but I think the plasmonic structures have to be optimized for a particular wavelength. I don't get it. I'm going to have to follow some of the references and figure this out...
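One concrete way to see the capacitance argument: the Elmore delay of an unbuffered on-chip wire grows quadratically with length, since both total resistance and total capacitance grow linearly. The per-micron values below are rough illustrative assumptions for a fine-pitch copper wire, not figures from Ozbay's article:

```python
def wire_delay(length_um, r=0.08, c=0.2e-15):
    """Elmore delay (seconds) of a distributed RC wire: t ~ 0.38 * r * c * L^2.

    r: resistance per micron (ohm/um), c: capacitance per micron (F/um);
    both are illustrative assumptions, not measured values.
    """
    return 0.38 * r * c * length_um**2


for length in (100, 1000, 10000):  # 0.1 mm, 1 mm, 1 cm
    print(f"{length / 1000:5.1f} mm wire: {wire_delay(length) * 1e12:8.1f} ps")
# Delay grows 100x for a 10x longer wire, which is why long electrical
# runs need repeaters, and why optical or plasmonic links look attractive.
```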
The article includes discussion of all-plasmonic chips, plasmonic light sources, and plasmonic nanolithography. The lithography I don't quite grok, either, but a silver superlens helps focus (using near-field effects) lithographic exposure to features much smaller than a wavelength. That looks very promising.
Overall, a fascinating and possibly fundamental shift to how we move data, but it appears to be a long ways to commercial use yet.
Unwise Microwave Oven Experiments
A Friday afternoon link for you. Looks like fun, but we have a nice, new, microwave/convection oven combo I'm reluctant to destroy. Hmm, my mother-in-law's microwave is getting old, I should find an excuse to buy her a new one...
(I have a whole stack of quantum and classical papers I'm going to post reviews of Real Soon Now, I promise...)
Thursday, February 02, 2006
CRA on SoU; and, Musing About the Future
I generally don't do politics on this blog (but I do it passionately via email), but Bush's State of the Union speech is generating a little interest in the computing research community.
The CRA has a response to the "American Competitiveness Initiative". W explicitly mentioned "promising areas such as nanotechnology, supercomputing, and alternative energy sources". He also proposed to "double the Federal commitment to the most critical basic research programs in the physical sciences over the next ten years".
It will be interesting to see how this plays out. Will it mean more money for quantum computing? Fusion research? Particle accelerators? Or will it be more short-term, more applied?
It should be obvious by now that the biggest change currently under way is mobile & ubiquitous systems. If you're (still) working on those areas (I spent four years at Nokia, and think highly of the company as well as the importance of the area), my hat's off to you. Supercomputing, from Google-sized data systems to SETI@home to the grid work at SDSC and ISI, was rightly highlighted. The potential of both of these topics has only just begun to be explored. Both supercomputing and mob/ubiq will change society profoundly in the next decade.
Then what's next? Come 2015, what are the next big topics? My guess is robotics and quantum computing. I don't want to go into all of the reasons right now, but I think robotics is finally about to blossom, and I believe quantum computing is closer to reality than most people realize. In my opinion, robotics/autonomous systems will start having a big societal impact within about a decade (yeah, I know, if you count industrial robots, it already DOES have a big impact, and the research itself has had countless spinoffs). While a useful quantum computer probably will not be on the market, within a decade its debut will seem inevitable, and many, many researchers will be scrambling to be the first to build and deploy such a system.
Wednesday, February 01, 2006
Earthquake@home
Experience earthquakes live, in your own living room! No need to travel anywhere!
We just now (8:35 p.m. on Feb. 1) had a magnitude 5.1-5.3 quake (the reported numbers differ a little) centered 118 km essentially directly beneath our house. I would call it a "shindo 2" on the Japanese scale. Mayumi thought it was a bus going past, but it was strong enough to give the girls a pretty good fright. I would guess it lasted about ten seconds.
Man, I love Hi-net. Real-time info.
I post these not because they frighten me (though the bigger ones do give me an adrenalin rush, this didn't), but because they fascinate me. There was a point in time when I considered becoming a geologist. A Caltech roommate and I used to sneak into Mudd, where they kept the remote seismographs, when we thought we felt something. This real-time web mapping is very cool, but there's something visceral about that wiggling needle on the scroll of paper.
I don't grok the sphere/circle info in the lower right hand corner of the picture...