Monday, July 07, 2025

I Believe in Web3...Just Not That Web3

(Note: This was originally written in summer and fall 2022, and for various reasons I decided not to publish it then, despite the obviously enormous amount of work I put into it. One reason was that I wasn't completely satisfied with it, so I still consider it to be a work in progress. Given the enormous volumes of writing out there dedicated to the broad topic of web3, it would also be rather bold of me to think that I have much new to offer. This is, instead, a way to organize my thoughts, which I am willing to share with you. If you are kind, I will be happy to carry on a dialog that might improve my understanding, as best my time permits.) 

I believe in the re-decentralization of the web. I want creators to be paid and I believe in the idea of micro-payments, and it would be nice if that meant something other than advertising. I love Larry Lessig's idea of managing the incentives that web2 companies have to create addictive, outrage-driven products. Although the Russian aggression in Ukraine brings the whole liberal project into question, I still think increased flow of goods, people, knowledge and principles across borders leads us toward a better world. And Joi Ito's description of web3 as being about community brings it into focus, gives it a direction.

What I don't believe in is most of the technologies being touted as transformative and necessary to web3.

What I don't believe in is blockchain as a currency, a store of value, or a speculative investment. I do believe it potentially has value as a public record of certain communications. It is also a brilliant technical innovation, still searching for the right way to be applied.

I very definitely don't believe in NFTs. I can't see what value they add at all.

And the metaverse...ah, the metaverse.

Backing up for just a second, blockchain, NFTs, and virtual reality/metaverse are the rather disparate technologies that are getting welded together and touted as the cure for everything that ails today's World Wide Web (and there is a lot that ails the web). Collectively called web3, the purported win is that they enable DeFi (decentralized finance, in contrast to CeFi or the "fiat economy", which is never used as a compliment), DAOs (distributed autonomous organizations), and more.

Supposedly.

And supposedly, in at least some tellings, not only do these technologies solve what ails the web, they solve some fraction of all the world's problems.

Which means, I suppose, we need to begin with a quick look at what some of those problems are and how we got here before we look more closely at the technologies on offer. 

The Vision

Among quite a number of things I read, the longest and most coherent was Joi Ito's book in Japanese, 「テクノロジーが予測する未来」(tekunorojii ga yosoku suru mirai, or The Future that Technology Predicts, more or less), which I read most of. It partly inspired this posting, so I will be referring to it quite a bit, but I don't want this to be just an analysis of Joi's arguments. The book is very focused on the notion of community and the ability to quickly create and scale up new communities. Joi is well known for being quick to "try on" new ideas, always looking for something to remake the world, so his thoughts are interesting even if not always as grounded in technological or human feasibility as we might wish.

Joi describes web 1.0 as read, web 2.0 as write, and web3 as join (or maybe "participate"). This takes us from 1.0's monolithic web servers, whose installation, maintenance and publication required significant technical expertise and capital, through blogs (and the early days of online shopping), to 2.0's Facebook-dominated world of SNSes, where anyone can write or share photos far more easily and a handful of hypergiant e-commerce corporations control what we search for, buy, read and use. Today, the buzzword web3 is supposed to help us build large-scale, global, autonomous communities, with as little regard for existing prejudices, practices, rules and laws as we can get away with.

Joi describes project-based organizations that come and go like movie productions. Given the chaos, stress, grift and uneven distribution of rewards in moviemaking, I'm not so sure that's an attractive description. It also sounds like a macro-scale gig economy, where many people have to hustle for every dollar, and I am certain that is not the right model for everyone (though it may be for some).

The word efficiency comes up repeatedly. To Joi, this seems to be the heart of what these new technologies bring, and it is a seductive Siren. After all, the worldwide web itself is "only" a more efficient way of publishing and sharing information. If a new efficiency really takes hold, it can transform the world. 

Ownership of not only the things you create but the things you buy is frustratingly difficult even to understand, let alone manage, today. Amazon can delete things from your Kindle, and prevents you from selling them on to others. HP can remotely disable the printer you bought and paid for if the credit card they require you to keep on file expires. John Deere has...a complicated relationship with the right to repair something you own. (Okay, now we're getting a little far from web3.) John Deere also lays claim to the data that their Internet-connected farm equipment collects. One of the principles of web3 is to return ownership of data to those who generate it.

The Internet is sometimes touted for its "permissionless innovation" (which, of course, is how we got the web itself -- no one had to give Tim permission to deploy the first web server). I would call DAOs "deferred legality". Spin up a quick bulletin board or server or GitHub project for the community to meet, invite others in, establish a handful of rules on how decisions are made (By humans? By an algorithmic "smart contract"? How do you allocate weight in voting?), how work is distributed and how people are paid (in tokens, presumably, whose utility for buying real-world groceries may vary), and you're in business -- maybe literally selling something, maybe just collectively creating something fun.

Joi has a chapter on how web3 will advance education. His first two points are about the gamification of education, and proof of credentialing. I'm all in favor of learning being fun, and if setting it up as a quest makes it easier for today's kids to follow and finish a plan, that's fine. For credentialing, inflated resumes or outright forged diplomas are a real-world problem, so I think it makes sense to have some sort of digital certificate with cryptographic non-tampering authentication that can easily be checked by prospective employers, ideally single-click from a submitted resume.
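To make the tamper-evident-credential idea concrete, here is a hypothetical sketch. Everything in it (the key, the field names, the use of HMAC) is made up for illustration; a real deployment would use public-key signatures (e.g. Ed25519) so that any employer can verify a credential without holding the issuer's secret, but Python's standard library has no asymmetric crypto, so an HMAC stands in for the signature here.

```python
# Sketch: an issuer "signs" a credential; anyone holding the signature
# can detect tampering. HMAC is a stand-in for a real digital signature.
import hashlib
import hmac
import json

ISSUER_KEY = b"university-registrar-secret"  # made-up key for this sketch

def issue(credential: dict) -> str:
    """Return a tamper-evident tag over a canonical encoding of the credential."""
    payload = json.dumps(credential, sort_keys=True).encode()
    return hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()

def check(credential: dict, signature: str) -> bool:
    """Constant-time comparison against a freshly computed tag."""
    return hmac.compare_digest(issue(credential), signature)

diploma = {"name": "A. Student", "degree": "M.Eng.", "year": 2022}
sig = issue(diploma)
print(check(diploma, sig))        # -> True
diploma["degree"] = "Ph.D."       # inflate the resume
print(check(diploma, sig))        # -> False
```

The "single-click from a submitted resume" part is then just a link that ships the credential and its signature to a verifier like `check`.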

web3 is touted as a way to disintermediate a lot of creative businesses and see that creators get paid more directly for their work. Some people even talk about it as a means of improving digital identity, reducing forgery and impersonation and outright theft of digital identities. It is even described as supporting better democracy.

I'm really not sure I've done the vision justice, but the vision is amorphous at best, and so far this is as close as I can come.

The Technology

Is there really a need for me to describe the tech here? Let me toss in a short description anyway.
Blockchain is a distributed ledger, with recorded entries that are proposed by an individual node (is "client" the right word here?), then committed using a large-scale distributed computation, involving many nodes (see below on "Getting Squeezed"). After being committed globally, a record can be more quickly confirmed using a local computation. To paraphrase one of my students, it's a single, world-writable, world-readable database. Anyone can record essentially anything on the blockchain; as it happens, the first proposed use was as an ersatz form of currency that people are expected to exchange for real-world goods.
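As a rough illustration of that "single, world-writable, world-readable database," here is a toy hash-chained ledger. It shows only the append-only chaining that makes committed records tamper-evident; the large-scale distributed consensus ("mining") step that real blockchains layer on top is deliberately omitted.

```python
# Toy hash-chained ledger: each record carries the hash of its predecessor,
# so altering any committed entry breaks every later link.
import hashlib
import json

def record_hash(record: dict) -> str:
    """Hash a canonical encoding of a record."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

class ToyLedger:
    def __init__(self):
        self.chain = [{"data": "genesis", "prev": "0" * 64}]

    def append(self, data: str) -> None:
        self.chain.append({"data": data, "prev": record_hash(self.chain[-1])})

    def verify(self) -> bool:
        """True iff every record's 'prev' matches the hash of its predecessor."""
        return all(rec["prev"] == record_hash(prev)
                   for prev, rec in zip(self.chain, self.chain[1:]))

ledger = ToyLedger()
ledger.append("alice pays bob 3 tokens")
ledger.append("bob pays carol 1 token")
print(ledger.verify())                                 # -> True
ledger.chain[1]["data"] = "alice pays bob 300 tokens"  # tamper with history
print(ledger.verify())                                 # -> False
```

The whole point of the expensive global commit step that this sketch leaves out is deciding, without a central authority, whose appended record everyone else must accept.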
NFTs are...well, what are they??? An NFT is just a token on a blockchain that represents...represents...that you, um, bought something digital, like an artwork, from someone, who may or may not have retained the copyright and may or may not make more of them and in any case even if you are the one and only legal owner of the digital artwork you can't stop someone else from copying it so if you want to really enforce your legal rights to it, go ahead but I don't think anyone has actually tested this legally yet.  (Whew. Ugh. I know. This really belongs below in the discussion of what's wrong with the technology, or maybe just below here in the section on legal matters, but I can't find any positive way to describe what an NFT is. Want a better explanation? Google it up yourself. There are a zillion people out there eager to explain to you why you should spend money on NFTs, but far fewer who can really tell you what one is and what you are actually buying.)
All right, I give up on the technical explanation unless you want me to talk about hashing rates and cryptographic security and the like. If you find anyone who can give you the Heilmeier catechism on either blockchain or NFT, let me know.

The Legal Matters

We actually just covered some of the legal issues of NFTs, sort of, so let's look at DAOs in the real world.

Joi touts the innovative, token-based governance of DAOs, but it's really not clear how contract disputes, labor disputes, legal liability, taxation, and adherence to international norms should be handled. Environmental and workplace regulations (not to mention rent, equipment and insurance of all forms) are pushed to the individual participants. Given that the vast majority of DAOs will remain too small to bother with, it can be argued that deferring the resolution of legal status is the right approach, but it's worth noting that this even includes the matter of legal jurisdiction.

Of course, I am not the first to think of any of this. In fact, Wyoming already has a law on the books on how to incorporate an LLC for a DAO. Tennessee also has such a law, signed into law in April 2022, and aims to be the "Delaware of DAOs". Services to help you set up a Wyoming DAO abound, and they encourage you to put in as much real money and to clarify these issues as completely as you can. Of course, the more completely you specify these things up front, the more it looks like a conventional small company with employee ownership, but there do seem to be differences. Consider, for example, a smart contract used to make decisions with impact outside the immediate group, such as buying or selling something. What if the transaction happens to be illegal in one or more jurisdictions? Who can be held liable, and how can the DAO be modified to make sure it doesn't happen again?

The Fit: Does the Tech Do the Job for Web3?

In a word, no.

Transactions

In meatspace, transactions differ in complexity and longevity. Buying a house requires much more paperwork than buying a donut. Not only is there more at stake financially, the world at large cares much more and for much longer (centuries, even) that I bought a house. It is natural that real estate transaction processes are more complex, and very little of that complexity has to do with the difficulty of physically or electronically signing, attesting and publishing documents or even moving the bits that represent large bank balances.
In the middle of the complexity and longevity scale are our everyday purchases. Lattes, gas, train and movie tickets, lunch. Many of us, worldwide, make several purchases a day, and a system that aims to handle all of this therefore needs to handle, oh, say, a hundred billion transactions a day.
At the far end of the spectrum is the ephemeral, ubiquitous, continuous cascade that is billions of people surfing the web. To build a micropayment system that pays content providers as you surf their contributions, we need to target something that scales to literally around a trillion transactions a day.
How do we do this today? Well, via true distribution of the work involved. The vast majority of transactions worldwide are small, two-party transactions that don't need a true, worldwide, globally readable record. Maybe the coffee shop needs to summarize its sales at the end of the day, and the money in your possession needs to be transferred to them at a mutually agreed-upon exchange rate of coffee to currency. (In fact, that's part of what currency actually is: a medium of exchange, a store of value, and a unit of account. It's such a common trope that there are CliffsNotes for it.) That money "in your possession" can be anonymous cash in your pocket, e-money stored in a card such as a Japanese Suica, in your bank account and accessed through a network such as Visa, or credit offered to you by a company such as Visa. It does need to prevent you somehow from double-spending your money, which is more of a challenge when it is held as bits than as some sort of physical token such as a bill or coin.
Bitcoin runs at around ten million transactions a month. By design. Its performance is some seven decimal orders of magnitude lower than what we need. At that rate, covering a single day's worth of web browsing would take Bitcoin roughly ten thousand years. And that's just the computation for the payments. Imagine how much storage is needed!
And Bitcoin is, by design, global. That transaction rate wouldn't be a problem if it represented only a single coffee shop -- nobody sells ten million cups of coffee a month. So systems that are working to improve scalability of Bitcoin or other blockchains are really trying to solve a problem that Bitcoin introduced. The approaches that I have seen, including ideas such as "lightning networks", seem to regain an order of magnitude or at most two. This still leaves us with a system a billion times slower than what we need. The only way to get there is to create true distributed systems. Maybe not so unlike the one we already have, despite all its flaws...
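The order-of-magnitude claims above can be sanity-checked with a few lines of arithmetic, using the rough figures from the text (estimates, not measurements):

```python
# Back-of-the-envelope check of the throughput gap between Bitcoin and
# the two transaction workloads discussed above.
import math

BITCOIN_TX_PER_MONTH = 10e6                    # ~10 million committed tx/month
bitcoin_tx_per_day = BITCOIN_TX_PER_MONTH / 30  # ~330,000 tx/day

PURCHASES_PER_DAY = 100e9   # everyday purchases, worldwide (rough estimate)
BROWSING_TX_PER_DAY = 1e12  # micropayment-per-page-view target (rough estimate)

for label, need in [("purchases", PURCHASES_PER_DAY),
                    ("browsing", BROWSING_TX_PER_DAY)]:
    gap = need / bitcoin_tx_per_day
    # -> ~5.5 orders of magnitude for purchases, ~6.5 for browsing
    print(f"{label}: {gap:.1e}x short (~{math.log10(gap):.1f} orders of magnitude)")
```

Even clawing back one or two orders of magnitude with lightning-style overlays still leaves a gap of ten-thousand-fold or more, which is the point of the paragraph above.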

Education

I don't see much overlap between web3 goals and education.

Radical transparency, as in David Brin's The Transparent Society, is an interesting notion, staking out one extreme of the design space. But I don't believe that students should have their every move out there for everyone in the world to see. Students need room to make mistakes, to find themselves, learn and change and grow without worrying what a potential employer will think. Yes, I know they are oversharing on SNSes 24x7 and have a different view of the whole process than my generation does, but that doesn't mean every homework set, poorly thought out essay, dispute with a professor and bad grade needs to be recorded on a ledger for all the world for all posterity. FWIW, seeing their github activity seems like a positive thing when looking for skills and accomplishments, but even that needs to be put in the context of the whole human and their experience.
Joi goes on to talk about completely revamping education, especially in Japan, around self-driven, purpose-based learning and what amounts to micro-credentialing. I'm in favor of empowering learners, and there is a chance to rethink how we structure learning, but I don't see how this connects to web3 at all. (Moreover, micro-credentialing takes away from the curriculum structure, and I believe in the value of a structured body of knowledge both for engineers and as citizens. Much more on this topic some other time.)

DAOs and Human Organizations

If you needed any persuasion that blockchain doesn't really help the already disadvantaged, this might do it: although cryptocurrencies are touted as being a way to share wealth and empower impoverished people, today the distribution of wealth is more uneven in cryptocurrencies than in real-world, fiat currencies and stores of value such as real estate. 

Instability, Scams and Ponzi Schemes, Burning up the Earth, and Other Such Minor Issues

Surely there is nothing more to be said by this point; a lot of people have pointed out problems with both blockchain and NFTs. Most importantly (in line with #1 in the list in the next section), I think the amount of energy expended every day for such a small number of transactions is absolutely unconscionable. Proponents have talked for years about shifting from proof of work to proof of stake, but a) it doesn't seem to be happening, and b) proof of stake appears to exacerbate some governance and consensus problems.

Arvind Narayanan had a nice thread on blockchain, quoting a blog posting by Bruce Schneier. Bruce and the others he links to cover the core arguments pretty well, so I am not going to reiterate them all here. Bruce is one of the signers of the letter to Congress urging regulation of crypto finance. Quoting just a couple of Arvind's tweets,

[B]lockchain has so far proven useless. Worse, it's proven a costly distraction to people and communities who are trying to solve real problems...I can't tell you how many times I've talked to energetic students with great ideas about what's wrong with our institutions who, in a sane world, would be working on fixing our institutions but instead have been seduced by the idea that you can replace them with smart contracts.

Let me address just a few of the issues that I think haven't gotten as much attention as they deserve, at least in the set of things I have been reading.

Getting Squeezed

Miners join a network voluntarily, with the idea that they provide a service that others will pay for. The single biggest problem in this libertarian paradise, from what I can tell, is that very aspect. With only a handful of miners worldwide, miners could make a comfortable living and the environmental impact would be low. But as long as there is profit to be made, new miners will join, driving up the collective mining rate and increasing competition, such that the probability of an individual miner receiving the payout for successful mining goes down. Thinner margins will mean that only those who can efficiently run large-scale operations can afford to stay in business. As a result, there is a massive explosion in worldwide mining capability that today damages the environment and distorts the market for semiconductors, but that can't hold indefinitely. I figure that ultimately there will be a crash or consolidation of miners such that we end up with the Walmart of miners, with everyone else driven out of business. (We may be seeing this already.) Or, perhaps the better analogy is Subway franchisees.

Instability

"Pump and dump" and other scams abound, but an even bigger issue, if possible, is whether the system itself actually works and can be trusted to always work when we need it. As I write this in summer 2022, the last several months have seen a lot of instability in the cryptocurrency markets, even in the so-called "stable coins" that are supposed to be pegged to a currency such as the dollar, but in reality are still vulnerable to the fundamental issues of liquidity (one of the key concerns expressed by Paul Blustein) and whether or not someone actually wants to buy what you are holding at a time when you have little choice but to sell.

Lately I have been seeing ads for automated crypto trading accounts. Even in the highly regulated world of stock trading, algorithmic trading is potentially the most destabilizing technology introduced since the stock ticker itself. Lots of agencies oversee the large operators, and economists in every major bank, government and trading house must be scrutinizing the situation and looking for positive feedback loops that can cause markets to gyrate out of control. They have also instituted "circuit breakers" in case major problems develop. The financial crisis of 2008 may have shown that small investors can and will still lose their entire investment, but the existence of the big houses and the regulators reduces the number of scams and provides some (emphasis on "some") recourse when troubles occur. The cryptocurrency market has none of this.

Scrip and Taxes

A question: are DAO tokens just company scrip? I don't think so, but there are enough similarities to be disquieting. In the early 20th century U.S., with human mobility rising rapidly but not yet easy and the megalopolises not yet a majority of the population, many remote, small towns were essentially one-industry, even one-employer, towns. Employers often paid employees in scrip instead of U.S. dollars. Scrip could buy goods, at inflated prices, at the company store, or be exchanged for dollars at disadvantageous rates.

DAO tokens feel a little like scrip, a little like getting paid in stock. Tokens can be exchanged for goods, but only within a limited community. Because you aren't physically limited, you can shop anywhere that will take your tokens/scrip, but to participate in the broader economy you have to find someone who will change your tokens for money that works in your local economy.

Of course I understand that there are different kinds of tokens (some say as many as six), some of which are closer to currency and some of which are closer to voting stock. But if part of the design is to maximize liquidity and the velocity of the economy, won't even the stock-like tokens be traded rapidly and likely wind up concentrated in a few hands? I'm pretty unclear on how the dynamics of all of this are supposed to work out, how they are likely to work out, and what the failure modes are. But I'm pretty doubtful that DAO tokens and cryptocurrencies with little value behind them are ultimately stable.

Speaking of trading and the economy, if you are paid for work in some DAO's tokens, which you can trade for goods, when does it become taxable income? When it gets exchanged for local fiat currency? What if that is never?

Vision, Revisited

Okay, with all that under our collective belt, let's revisit the vision. I actually like the idea that our organizing principle should be community. Here's what that means to me, in terms of technical challenges:
  1. Climate change and sustainable development. Without solving these things there is no community, no human security, no meeting of basic human needs.
  2. Data and information systems security. As IT people, this has to be Job One. Add in general systems stability (CIA = confidentiality, integrity and availability), and this is far more important than some random, clever new feature.
  3. Establishing personal autonomy, privacy and empowerment. Note that, to liberal me, this does not imply leaving people out there on their own, with no support.
  4. DEI. (To the extent to which it's different from the above.)
  5. The next billion. Kilnam Chon has a talk with the title Future Internet for the Other Billions. Ever since its inception, the Internet has always faced the challenge that adding the next group of connected people has meant reaching groups that are less technologically literate and perhaps poorer, and with greater environmental and infrastructure challenges of all sorts.
  6. Technologically, the end of Moore's Law. Estimates of data center energy consumption range from 70 TWh/year to three times that, or around 1% of the global total electric power generation. Although our efficiency has increased dramatically, we, as an industry, still have work to do.
If you are working on one of those issues, kudos to you. If you aren't, but you do other work that you find rewarding, that's great, too. But if you're working on something that actively leads away from solutions to the above, or if you're fooling yourself that somehow the current proposed web3 technologies do any of the above, I would encourage you to rethink, carefully considering the big picture.

Final Thoughts and Notes

...wait, I've gotten all the way to the end here without mentioning "the metaverse", or virtual reality worlds. Maybe that means...it's really a separate thing? FWIW, I was intrigued by VRML all the way back in the mid-1990s. It seemed like a good idea at the time. Whether the tech could keep up was another matter. I suppose, eventually, we will have Snow Crash-style virtual reality, but not yet. People -- including many who are unhappy with their real-world circumstances -- will probably like it. But other than a place to hang your virtual art you spent a lot of tokens acquiring, it's not clear to me that it's either necessary or sufficient for web3. (There have been lots of novels and short stories set in virtual worlds in the last four decades or so. One recent one I read is A Beautifully Foolish Endeavor, with a villain I swear is modeled on Jordan Peterson.)

Since I have invested two decades in working on a technical area or two whose real-world value has yet to be proven, I'm sensitive to criticisms of blockchain that it's a hammer in search of a thumb to hit. I have worked in large companies, in startups, and in academia in both Japan and the U.S., but people like Joi and many, many serial entrepreneurs have far more experience than I do, so you should probably trust them more than me on what works well and what doesn't. 

One friend of mine said, "It's so complex it's hard to see how stupid it is." This friend has spent far more time studying all of this than I have. Until recently, I had not invested much effort in studying web3, and I am sure it shows. I will endeavor to keep learning, and may update this posting. If so, I will add a change log at the bottom.

A final note: I recently had a conversation with one of our students, Shaimay Shah, who is due to graduate momentarily. He said (quoted with permission),

I think the web3 tech is my generation's way to maybe make a difference...I want to look back 20-30 years down the line and tell my kids that I made a difference to society.

That's the goal, and it's a good one. The question is, which of society's problems can be solved via web3, and what tools will take us there?

References

A few of the things that I read that are at least moderately helpful, some from true believers and some from skeptics, some linked to inline above but most not. Apologies for the kind of ragged formatting and lack of inline citations above. Even deeper apologies for not having them organized in any useful fashion; this list grew organically.
  1. Nick Weaver, "The Web3 Fraud", my favorite technical criticism on the technical costs of running a web3 site.
  2. Dave Farber and Dan Gillmor, Cryptocurrencies Remain a Gamble Best Avoided.
  3. https://www.msn.com/en-us/news/technology/bored-ape-yacht-club-the-nft-collection-that-s-becoming-a-real-offline-brand/ar-AAQT8yv
  4. https://www.vice.com/amp/en/article/y3v3ny/all-my-apes-gone-nft-theft-victims-beg-for-centralized-saviors
  5. https://moxie.org/2022/01/07/web3-first-impressions.html
  6. NFT Mona Lisa
  7. https://internetcomputer.org/
  8. https://dfinity.org/
  9. https://www.technologyreview.com/2020/07/01/1004725/redesign-internet-apps-no-one-controls-data-privacy-innovation-cloud/
  10. https://www.stephendiehl.com/blog/against-crypto.html
  11. NFTs being stolen:
  12. https://twitter.com/arvalis/status/1468814628276695041
  13. https://docseuss.medium.com/look-what-you-made-me-do-a-lot-of-people-have-asked-me-to-make-nft-games-and-i-wont-because-i-m-29c7cfdbbb79
  14. Inequality in crypto:
  15. Ashish Rajendra Sai, Jim Buckley, and Andrew Le Gear, "Characterizing Wealth Inequality in Cryptocurrencies".
  16. Khristopher Brooks, "Bitcoin has its own 1% who control outsized share of wealth".
  17. David "DSHR" Rosenthal, "Can We Mitigate Cryptocurrencies' Externalities?"
  18. https://twitter.com/doctorow/status/1493288001107021826
  19. Irving’s blog posts: 
  20. https://blog.irvingwb.com/blog/2022/04/what-is-web3.html
  21. https://blog.irvingwb.com/blog/2021/12/the-metaverse-the-next-major-phase-of-the-internet.html
  22. https://www.mollywhite.net/annotations/latecomers-guide-to-crypto
  23. Links from Mike: (1) Useful background reading about the future of the metaverse: https://www.eurasiagroup.net/live-post/the-geopolitics-of-the-metaverse   (2) especially the first and last sections. https://moxie.org/2022/01/07/web3-first-impressions.html
  24. https://conversationalist.org/2020/03/05/the-prodigal-techbro/
  25. https://decrypt.co/100687/ethereum-creator-vitalik-buterin-contradictions-web3-values
  26. https://twitter.com/VitalikButerin/status/1526378787855736832
  27. Joi Ito 伊藤穰一 on web3 town Yamakoshimura 旧山古志村
  28. Another article on web3 town Shiwa-cho, a town of 33,000 people in Iwate-ken.
  29. https://ssir.org/articles/entry/the_good_web#
  30. https://ethereum.org/en/developers/docs/web2-vs-web3/ 
  31. https://inrupt.com/solid/p -- frustratingly short on details, but Tim Berners-Lee's vision sounds pretty good.
  32. Ethereum consortium, Web2 v. Web3. From Ethereum's point of view, of course, but pretty level headed. What's missing is whether Ethereum is either necessary or sufficient, or even a way, to achieve the goals laid out.
  33. https://www.zenbusiness.com/how-to-start-a-dao/
  34. Cointelegraph: Deconstructing sidechains — The future of Web3 scalability.
  35. https://cointelegraph.com/news/deconstructing-sidechains-the-future-of-web3-scalability
  36. https://asia.nikkei.com/Opinion/Bitcoin-will-be-remembered-as-a-historically-insignificant-fallacy
  37. https://doctorow.medium.com/moneylike-d20f8279a72e
  38. https://blog.makerdao.com/the-different-types-of-cryptocurrency-tokens-explained/
  39. https://www.fsa.go.jp/en/policy/bgin/ResearchPaper_qunie_en.pdf
  40. https://www.nasdaq.com/articles/what-is-ethereum-name-service-and-how-do-you-get-a-.eth-web-3.0-domain
  41. https://www.researchgate.net/publication/260438995_Trends_in_worldwide_ICT_electricity_consumption_from_2007_to_2012
  42. https://www.nature.com/articles/d41586-018-06610-y
  43. Morgan Ames, "Laptops Alone Can't Bridge the Divide," a must-read on the failure of technology alone to solve problems in education.
  44. Folding Ideas, "Line Goes Up -- The Problem with NFTs", a two-hour, fast-paced dissection of NFTs and blockchain both. Mostly clear and mostly calm, but occasionally heavy on the jargon and slips into occasional fits of outrage. Surprisingly watchable, despite the length and topic.
  45. https://nymag.com/intelligencer/article/three-arrows-capital-kyle-davies-su-zhu-crash.html
  46. https://web3isgoinggreat.com/
  47. https://www.nature.com/articles/s41598-022-18686-8

Change History


Thursday, May 15, 2025

Modern-Day Optical Network Physical Signal Encoding


(PAM-3 eye pattern; image from Wikipedia)

This blog posting is still in editing, and is posted just so I could talk to some students about elements of the contents.

Recently, I posted about how SDH optical networks encode bits at the physical level. My interest in the topic stems from a desire to know how to multiplex classical and quantum information on different channels/wavelengths, and part of that involves a basic understanding of the classical signals on the fiber. In both the demonstration network we are building and in the longer term as experimental specifications develop into standards, we may choose to put classical synchronization signals and the like for the quantum signals into the same fiber, or we may decide to carry full-on classical data traffic.

Of course, SDH is rather old now, going back to the 1990s, and optical networking has advanced considerably, especially for data center and local area networks. The most obvious place to look for newer developments is Ethernet, so here we are. First, let's look at almost-but-not-quite leading edge networks, where the technological decisions are more settled than in, say, Ultra Ethernet, which is still under development. (I do hope to come back to Ultra soon, but the draft specs are currently closed to the public.)

[tl;dr: PAM-4, with four distinct signal amplitudes, is common. Development of Ethernet using 16QAM was suspended after a draft specification was developed but not approved. As far as I can tell there is no standardized use of quadrature amplitude modulation in the optical regime, though it's common in RF.]

Many Ethernet specifications are available for free from IEEE, including 802.3db-2022 - IEEE Standard for Ethernet - Amendment 3: Physical Layer Specifications and Management Parameters for 100 Gb/s, 200 Gb/s, and 400 Gb/s Operation over Optical Fiber using 100 Gb/s Signaling, and 802.3df-2024 - IEEE Standard for Ethernet Amendment 9: Media Access Control Parameters for 800 Gb/s and Physical Layers and Management Parameters for 400 Gb/s and 800 Gb/s Operation, both of which mention PAM but not QAM.

400Gbps Ethernet has 11 separate physical layers that run over fiber (one still in development), plus two twisted-pair copper variants and one backplane form.  Let's focus on the fiber variants, since our interest here is photons in fibers (sometimes many of them, sometimes only one). Five of the variants listed at Wikipedia use PAM-4 (sometimes written PAM4 in the page), one is listed as 16QAM (but more on that below), and the rest don't say; perhaps they are simply on/off NRZ keying, as in earlier optical networks such as SONET/SDH.  (It's nice that this information was much easier to find than the original SONET/SDH stuff!  Partly because I better understand what I'm looking for this time, I suppose.)

...So what are PAM-4 and 16QAM?

PAM, or Pulse Amplitude Modulation, is pretty straightforward: instead of using signal ON at full power for a 1 and OFF for a 0, you use several voltages (or RF or optical signal strengths), so that each symbol can represent more than one bit.   A related term is Amplitude Shift Keying; I'm not sure exactly why the Ethernet folks stick to PAM instead of ASK. The picture at the top of this posting is PAM-3, with three levels: bottom, middle, and top. The sloping lines are transitions from one level to another; the cleaner those lines are, the bigger the "eye" is, indicating that your circuit is very stable. (If you don't know how to read an eye diagram, you should learn.)

PAM-4 uses four signal levels, carrying two bits per symbol. Prolabs and Samtec have nice explanations of PAM-4, including eye diagrams. I mentioned above that there are many different physical layers for Ethernet; PAM-4 is used in the 400GBASE-DR4, 400GBASE-FR8, 400GBASE-LR8, 400GBASE-FR4, and 400GBASE-LR4-6 variants, running over parallel single-mode fibers (DR4) or over multiple wavelengths multiplexed onto a single fiber pair (the FR and LR variants). (More on that multi-fiber, multi-wavelength concept some other time.)

That was easy. Whew! ...but what about 16QAM? Gotta mind your Ps and Qs...same thing? Nope! QAM is a lot more complicated, involving a lot of signal processing theory we're not going to get into in this blog, but let's take a quick look anyway.

QAM is Quadrature Amplitude Modulation. In QAM, we modulate the signal using two separate waves, one sine term and one cosine term. When using on-off keying or PAM on optical fiber, our carrier signal is just the laser light amplitude, and the system is relatively insensitive to the phase of the light. With QAM, however, the phase of the carrier is critical.

One of the most basic types of QAM is 16QAM.  When first getting oriented, I found this web page to be helpful, but keep in mind that it's talking about the use of 16QAM for radio signals, not optical. In 16QAM, we use two signals, I and Q, that are both sinusoids, 90 degrees out of phase. Each of those two signals is modulated to carry two bits, then the two are combined, so that I+Q is the signal transmitted. The modulation involves two choices of amplitude, and two choices of phase -- either 0 or 180 degrees added to the already 0 or 90. Since there are four possible choices for each of I and Q, we have a total of 16 possible waveforms, hence 16QAM. The 16 waveforms are shown below.
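The I/Q construction just described can be sketched numerically: two choices of amplitude times two choices of phase (0 or 180 degrees) is equivalent to four signed values for each of I and Q. The specific amplitude values {±1, ±3} below are the usual textbook choice, but treat them as illustrative:

```python
import numpy as np

# Sketch of 16QAM symbol construction: I multiplies a cosine, Q a
# sine (90 degrees apart), and the transmitted waveform is the sum.
# 4 possible values of I times 4 of Q = 16 distinct waveforms.
LEVELS = (-3, -1, 1, 3)

def qam16_waveform(i_level, q_level, n_samples=64):
    t = np.linspace(0, 1, n_samples, endpoint=False)  # one carrier cycle
    return i_level * np.cos(2 * np.pi * t) + q_level * np.sin(2 * np.pi * t)

waveforms = [qam16_waveform(i, q) for i in LEVELS for q in LEVELS]
print(len(waveforms))  # -> 16
```

Plotting those 16 arrays side by side reproduces the kind of waveform chart shown in most 16QAM tutorials.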


The wild thing is that you can recover the four bits cleanly from that.  I won't go into the math here, but you can check Wikipedia.

Those sixteen waveforms are arranged in a particular way, corresponding in order to what is called a constellation diagram. The constellation diagram helps you understand the noise in the system and figure out whether or not your hardware can reliably reconstruct the original set of bits.
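The recovery trick is that cosine and sine are orthogonal over a full carrier period, so correlating the received waveform with each reference separately pulls out I and Q; snapping each estimate to the nearest constellation level then gives back the bits. Here is a toy, self-contained sketch of that idea (it assumes a perfect local carrier; real receivers also need carrier and clock recovery, as discussed below):

```python
import numpy as np

# Toy coherent 16QAM demodulation: correlate with cos and sin
# references, then snap to the nearest constellation level.
t = np.linspace(0, 1, 256, endpoint=False)
cos_ref, sin_ref = np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)

def demod(signal):
    # mean(cos^2) = 1/2 over a full period, so scale correlations by 2
    i_hat = 2 * np.mean(signal * cos_ref)
    q_hat = 2 * np.mean(signal * sin_ref)
    snap = lambda x: min((-3, -1, 1, 3), key=lambda lv: abs(lv - x))
    return snap(i_hat), snap(q_hat)

tx = 3 * cos_ref - 1 * sin_ref                              # send I=+3, Q=-1
rx = tx + 0.2 * np.random.default_rng(0).normal(size=len(t))  # mild noise
print(demod(rx))  # -> (3, -1)
```

The averaging in the correlation also suppresses noise, which is why the decision still lands on the right constellation point despite the added jitter.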

A helpful, if not super-high quality, video:
https://www.youtube.com/watch?v=6BIqEWEe5-I

The 400Gbps Ethernet variant 400GBASE-ZR was supposed to use 16QAM. BUT:

The 802.3cw website says, "The work of the IEEE P802.3cw 400 Gb/s over DWDM Systems Task Force concluded with the withdrawal of IEEE P802.3cw PAR on 22 May 2024." Apparently, they published P802.3cw/D3.0, Dec 2023 - IEEE Draft Standard for Ethernet Amendment: Physical Layers and Management Parameters for 400 Gb/s Operation over DWDM (dense wavelength division multiplexing) systems as "Active - Draft" in Dec. 2023.  It's not available in the freely accessible documents yet (which only include approved standards at least six months old, as I understand it), and even my university account seems to be unable to reach it.  Too bad, that's definitely where I wanted to be looking!  https://www.ieee802.org/3/dj/public/24_03/motions_3cwdj_2403.pdf says the PAR (Project Authorization Request) was withdrawn by unanimous consent on 13 March 2024. This was foreshadowed a month earlier in an email from John D'Ambrosia, chair of the Task Force.

According to the Wikipedia page on terabit Ethernet, it was proposed in 802.3cw to use dual-polarization 16QAM, which doubles the bits per symbol but sounds even more complicated to me.

I don't know yet where the carrier for reconstructing the signal comes from...if it's just the laser itself, that's about 200THz for 1.5um light, and we need some reference to find the right phase for the carrier. One research paper on carrier recovery:
https://opg.optica.org/jlt/abstract.cfm?uri=jlt-27-15-3042
No idea if that's what's used...

And another survey paper, heavily cited in papers and in (at least) 20 patents:
https://ieeexplore.ieee.org/abstract/document/5464309


As a companion to this posting, I am developing a Jupyter notebook on 16QAM that made the plot above.

Tuesday, May 06, 2025

Julia Parsons, "Code Girl", 1921-2025


(Image taken from the Seattle Times, where it is credited to the World War II Foundation.)

 The New York Times reported last week that Julia Parsons passed away. (The Seattle Times has a copy not behind a paywall.) She was probably the last living member of the WWII Naval Communications Annex team responsible for deciphering Enigma messages sent to and from German U-boats. She joined the WAVES in 1942, right after graduating from Carnegie Tech (now Carnegie Mellon University), and was assigned to work in the unit from 1943 through the end of the war.

I don't recall if she was mentioned by name in Liza Mundy's Code Girls, but she was definitely part of that crew.  If you haven't read that book, you really should.

As one of the youngest members of the group, her initial task was to work directly on the deciphering of the messages from the U-boats. She worked with the US Navy Bombe, feeding it possible plaintext and ciphertext. The Bombe would then produce a "menu" of possible Enigma wheel settings that had to be checked to determine which (if any) of them would correctly decrypt the message. The Wikipedia article has an excellent description of the workflow.

Because the work she did was classified, she didn't talk to anyone about it until 1997, when she discovered it had been declassified in the 1970s. We probably lost a lot of history that way, as even by the 1970s many of the senior people involved had doubtless passed away.

Thank you, Ms. Parsons, for what you did for democracy and freedom. I know it came with a cost.

Wednesday, April 23, 2025

Basic Signal Modulation in SDH Optical Networks



 Last night, I had an extremely basic question on how signals get onto an optical fiber, so I started looking for information on it, including the standards themselves.  Despite starting from what I think of as a reasonable base of knowledge, it took me over an hour to find the answer, so I figured I'd write it down here, for both myself and posterity.

[tl;dr: At least for the set of specs I looked at, it's simple on-off keying with NRZ (non-return to zero) encoding, though RZ is also possible; that is, the light being on is a 1, and light being off is a 0. The ones I looked at don't use exotic things like phase-shift keying or the like.]
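The difference between the two line codes in the tl;dr is easy to see in a toy sample-level model (this is just an illustration of the encodings, not anything from the Recommendations themselves): NRZ holds the light level for the whole bit period, while RZ pulses on for only part of a 1's period and returns to zero.

```python
# Toy illustration of on-off keying: 1 = light on, 0 = light off.
def nrz(bits, samples_per_bit=4):
    # NRZ: the level holds for the entire bit period
    return [b for b in bits for _ in range(samples_per_bit)]

def rz(bits, samples_per_bit=4):
    # RZ: light on only for the first half of a 1's bit period
    half = samples_per_bit // 2
    return [b if i < half else 0 for b in bits for i in range(samples_per_bit)]

print(nrz([1, 0, 1]))  # -> [1, 1, 1, 1, 0, 0, 0, 0, 1, 1, 1, 1]
print(rz([1, 0, 1]))   # -> [1, 1, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0]
```

Note that a long run of identical bits in NRZ produces no transitions at all, which is exactly why clock recovery (question 1 below) is an interesting problem.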

I already knew that I wanted to look at the standards for SONET or SDH (Synchronous Digital Hierarchy), which are essentially the same, and I knew that ITU-T is the organization that handles SDH. ITU-T is a standards organization, part of the ITU, which is an agency of the UN (although ITU itself is much older than the UN, go figure).  It handles a lot of aspects of communication, but the part of interest here is Series G: Transmission systems and media, digital systems and networks.  (Simply determining that this is the set of specs to look at took me a long time.  Google doesn't really index into the ITU-T pages very well, and I wasn't familiar with the "Series" structure of the ITU-T standards.  Moreover, when Google does give you a direct link to the file, it's often an older version instead of the correct, up-to-date one.) (It's also worth noting here that these are published as "Recommendations", and don't have the force of law unless adopted into a law by some country.)

It was pretty easy to find information on frame formats for STM-1, which is the key transmission format for SDH.  Getting from there down to the physical layer encoding was the next step, where I stalled.  Here's what I eventually found, most of them in the G.95x Digital line systems series:

  • G.691 : Optical interfaces for single channel STM-64 and other SDH systems with optical amplifiers 
    This spec isn't part of the G.95x series but it has more on physical layer, eye mask, rise times, power levels, dispersion accommodation, etc.
  • G.955 : Digital line systems based on the 1544 kbit/s and the 2048 kbit/s hierarchy on optical fibre cables
    This is actually pretty old (1996), but hasn't been withdrawn.  There are lots of values for allowable attenuation, etc. that were simply listed as "under study".
  • G.957 : Optical interfaces for equipments and systems relating to the synchronous digital hierarchy
    One of the most important, but be careful; it has been updated, so there are multiple versions floating around and the older one doesn't say "superseded".  You want the 200603 version, dated 03/2006. This one also talks about dispersion, if that's your gig.
  • G.959.1 : Optical transport network physical layer interfaces
    Ah, finally, here's the money!!!  A few simple lines in Sec. 8.2.2.13: "The convention adopted for optical logic levels is:
    • − emission of light for a logical '1';
    • − no emission for a logical '0'."
G.959.1 also refers to "optical tributary signal class NRZ 2.5G" and several similar terms.  Knowing to Google for those may help in the future.  Also, this is where I got the eye mask figure (a "mask" of acceptable values for the eye diagram) at the top of this posting.

This still leaves me with some questions:
  1. How is clock recovery done in the NRZ system?
  2. How are frames demarcated?
  3. How do you turn a laser diode on and off that fast in an electrical circuit?
  4. Definitely need to study up more on filtering, DWDM (especially how close channels are allowed to be), and add/drop devices.
There are also newer optical networks, including the most advanced forms of today's Ethernet, and I'd like to look at the same questions about them. I'll come back to all of these another time.

Oh, and for what it's worth, check out the Y series, where you'll find Recommendations on Quantum Key Distribution, Internet of Things, and other leading-edge topics.

[Edit: The book Optical Networking Standards: A Comprehensive Guide for Professionals was published back in 2006. I found it after writing this blog post, but I have found it useful. My university has access to the PDF, yours may, too.]

Saturday, April 05, 2025

Spelunking CACM, Vol. 23 (1980): Pilot and Medusa



The structure of the magazine is changing, but not the covers, which are still primarily black and blue, with some abstract design. The articles are longer and more in-depth, and each issue has only a few articles. Some of the older types of notices and articles, such as short algorithm descriptions, have been retired. And then rather suddenly around August or perhaps a little earlier, the article format itself changed and became more modern, with a nicer header and a three-column, large-magazine format.

Apparently there was a thing called the IBM Programmer Aptitude Test, which I have never heard of.  A quick check didn't find a copy of it online, but some discussion seems to indicate that it was a lot of math word problems. A couple of researchers tested it rigorously, and found, surprise, very weak correlation between the test result and grades in an introductory FORTRAN programming course. They also found an even weaker correlation between gender and performance.

Harold Abelson and Peter Andreae wrote about tradeoffs in VLSI design. Interestingly, this is the first time I recall seeing the term "VLSI", though maybe it just didn't catch my eye before. The term itself should have been only about three years old at the time (according to Lynn Conway's reminiscences), and yet the article doesn't bother to expand the acronym.

Not only was there a chess tournament featuring a dozen programs at a 1979 ACM conference, there was also an early attempt at a man-machine team, what we might now call "centaur" or "advanced" chess, playing against (in this case) a lone human. The article authors were relieved that the lone human won.

Xerox Business Systems (not PARC?) authors published about Pilot (the figure below), an OS for personal computers complete with a single 32-bit virtual address space.  (In fact, we might call it a 33-bit address space today, since addresses were to 16-bit words.) Pilot used a flat (non-hierarchical) namespace for files, each of which had a 64-bit unique identifier they refer to as a capability. The capability is supposed to be unique in space and time, across all machines. Frustratingly, the article doesn't contain much on the hardware required to run Pilot, but it's implemented in Mesa and very closely tied to that language.  Inter-process communication can be via either shared memory or the provided communication libraries, which focused primarily on the PUP protocol suite, though they describe similarities to the ARPANET protocol suite. Like TCP/IP, PUP includes internetworking concepts in it.

As long as we're talking operating systems, maybe the more interesting one technically is Medusa, which ran on the Cm* multiprocessor developed at Carnegie Mellon University (the figure at the top of this posting). The article by Ousterhout et al. describes an extremely sophisticated OS running on a distributed shared memory system. The hardware, like a NUMA system today, can directly access local or remote memory, with up to about a 10x latency penalty. The processors are LSI-11s, a version of the workhorse PDP-11 that was used for so many things over two decades and several hardware and OS iterations.

A task force comprises activities: the former roughly resembles a modern process and the latter correspond to threads, with each activity playing a specific role in the overall program/utility. Unlike threads, though, each activity is bound to a processor, while a task force can apparently span activities on many processors. One approach, shown below, is to have the many utilities (daemons, in modern terms) of the OS each running on a separate processor.

Arguably Medusa echoes some aspects of Farber's DCS and in turn influences things like VAXclusters, though neither of those systems had direct hardware access to remote memory, as far as I know/recall.

Guy Steele and Gerald Sussman described a LISP microprocessor.  Interestingly, despite the fame of that pair, this article hasn't been cited much; perhaps the commercial LISP machines of just a few years later don't really owe much to it?

Also, I hadn't realized that there were formal attempts to verify security in an OS that far back -- and using capabilities, to boot.

Enough for now. Once again, this is turning into a catalog rather than a dive into one or two pleasing papers, but it's intriguing to see so much on distributed OSes showing up. What a time it was, and I was still too young to participate at all, even though some of this major work was taking place driving distance from my parents' house, in Pittsburgh. If I had known then about that work, and that I wanted to do computing systems, I might very well have gone to CMU instead of Caltech, and how different my life would have been then. Although my life later converged with many good people from CMU!

Cross-validating Quantum Network Simulators

 New paper on the arXiv. I'll be presenting this one at an INFOCOM workshop in London next month.

During this cross-validation process, we not only fixed bugs in both simulators, but we gained a deeper understanding of the performance differences caused by protocol design differences.

Cross-Validating Quantum Network Simulators

Joaquin Chung, Michal Hajdušek, Naphan Benchasattabuse, Alexander Kolar, Ansh Singal, Kento Samuel Soon, Kentaro Teramoto, Allen Zang, Raj Kettimuthu, Rodney Van Meter

We present a first cross-validation of two open-source quantum network simulators, QuISP and SeQUeNCe, focusing on basic networking tasks to ensure consistency and accuracy in simulation outputs. Despite very similar design objectives of both simulators, their differing underlying assumptions can lead to variations in simulation results. We highlight the discrepancies in how the two simulators handle connections, internal network node processing time, and classical communication, resulting in significant differences in the time required to perform basic network tasks such as elementary link generation and entanglement swapping. We devise common ground scenarios to compare both the time to complete resource distribution and the fidelity of the distributed resources. Our findings indicate that while the simulators differ in the time required to complete network tasks, a constant factor difference attributable to their respective connection models, they agree on the fidelity of the distributed resources under identical error parameters. This work demonstrates a crucial first step towards enhancing the reliability and reproducibility of quantum network simulations, as well as leading to full protocol development. Furthermore, our benchmarking methodology establishes a foundational set of tasks for the cross-validation of simulators to study future quantum networks.

Monday, March 03, 2025

Open Faculty Positions at Keio's Shonan Fujisawa Campus

 We have five, count 'em, five, open faculty positions in Keio's Faculty of Environment and Information Studies. If you are a researcher looking for a tenure-track position, please consider applying.

This call is very open; people of all stripes are encouraged to apply. As chair of the Cyber-Informatics Program, of course, I am hoping we will have the opportunity to hire some first-rate people in computing (defined very, very broadly).

Applications are due at the end of March, I believe.

https://www.sfc.keio.ac.jp/en/employment/


Saturday, March 01, 2025

Africa and Foreign Aid Today



A week or so ago, as the scale of the USAID cuts was becoming clear, an acquaintance on Facebook gleefully posted a meme about Africa. At best it callously suggested that U.S. aid to Africa is having no effect. A more critical interpretation of the intent was the racist message that Africans have always lived in grass huts and always will, that it's a basket case not worth caring about.

I'm no expert on Africa; I have never set foot on the continent (unlike a number of friends here, some from the continent itself). I know only what I learned in college four decades ago (where I studied under the brilliant Ned Munger and the equally brilliant Thayer Scudder), what I read, and what I hear from working with students, postdocs and collaborators from Egypt, Eritrea, South Africa and Senegal. But if I don't speak up, who will?

First off, of course, Africa is not just one thing. Those four countries I just named are probably at least as different as Finland, Romania, Greece and the U.K. But Africa is often roughly divided into sub-Saharan Africa and North Africa (often lumped in with the Middle Eastern Islamic countries), and the links I include below follow that division.

Africa today is a dynamic and growing place. Over the last three decades, GDP across the continent has tripled, outpacing the growth of the U.S. economy, which doubled over the same period. An increasing number of countries have reached World Bank middle income or upper middle income status; check the links at the bottom of this post.

Africa is urbanizing rapidly. Currently around 45% of the continent lives in cities. This comes with its own problems of sanitation, water, electricity, pollution, transportation and general governance; no one would pretend it's perfect. But the idea of Africans all living in huts is...well, you know.

Life expectancy is also growing rapidly in many countries. Some countries have levels that rival those of some European countries.

These metrics vary dramatically across the continent and correlate strongly with the quality of governance. This is an area where European and American countries are particularly responsible, thanks to centuries of colonial rule and suppression of local voices, and the devastation to West African societies caused by the slave trade (yes, that ended long ago, but the effects persist today). Colonialism has only ended within our lifetimes. The colonies were set up by Europeans without regard to existing cultural, political and economic structures, and some African countries that were amalgams of various cultures were left unprepared to deal with the issues of governance.

Africa is full of ambitious, smart, creative, hardworking people. To get a glimpse, check out the CNN programs Inside Africa and African Voices Changemakers. (I wish CNN had programs as good on Asia!)

How much of all of this progress is due to U.S. foreign aid, I can't say. But PEPFAR alone has saved the lives of millions of Africans, and work done by NGOs on topics such as guinea worm eradication have improved quality of life for all.

We should all celebrate when any country or region grows and improves on these metrics, and worry and offer a hand when they don't.

We are all in this together. If you don't know how to care about other human beings, I don't know how to explain it to you.