Monday, June 06, 2022

Spelunking CACM, vol. 8 (1965): debugging, peephole optimization, and an undergraduate curriculum


Overall, 1965 seemed like an "ordinary" year at CACM. I didn't see any papers that really caught my eye as some sort of extraordinary event. (Stay tuned for 1966 for that.) But there are a few things of interest, of course.

By 1965, the idea of wide-area networks was definitely in the air. Paul Baran was publishing RAND reports on the key ideas of packet switching (although I'm not sure when those became publicly available), and the ARPANET project would kick off in 1966. But connections between computers were still much more an idea than a reality, and so I.R. Neilsen had a CACM paper on a 5kHz modem for "callup" connections for exchanging medical data.

Evans and Darley presented DEBUG, which worked with other tools to allow smooth insertion of new lines of assembly while debugging a program, preventing what we now call "spaghetti code". DDT already existed for the PDP-1 at this point. I find this early history of debugging intriguing, since I have a Ph.D. student working on debugging for quantum programs now.

Maybe the most interesting trend is a pair of papers (is two enough for a trend?) on optimizing compilers. In June, Nievergelt presented "On the Automatic Simplification of Computer Programs", which proposed a set of backend techniques for applying individual optimizations to a stream of instructions. The paper enumerates limitations on the use of the proposed techniques, which interestingly included the constraint that programs not be self-modifying, evidence that self-modifying code was already in use.

Then in July, McKeeman gave us "Peephole Optimization". He implies that the technique is already known and "simple" but "often neglected", and also that he is coining the term "peephole optimization" in this paper. Again implemented for the PDP-1, the explanation is clear but limited, being just over a page.
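To make the idea concrete, here is a minimal sketch in Python. The two-field instruction format and the STORE/LOAD mnemonics are my own invention for illustration, not McKeeman's PDP-1 encoding; the pass slides a small window over an instruction stream and drops a load that immediately follows a store of the same address, the sort of redundancy a peephole optimizer targets.

    # Toy peephole pass: delete a LOAD that immediately follows a STORE
    # of the same address, since the accumulator already holds the value.
    # Instruction format and mnemonics are invented for illustration.
    def peephole(instructions):
        out = []
        for op, arg in instructions:
            if out and out[-1] == ("STORE", arg) and op == "LOAD":
                continue  # redundant load: value is still in the accumulator
            out.append((op, arg))
        return out

    program = [("STORE", "X"), ("LOAD", "X"), ("ADD", "Y"), ("STORE", "Z")]
    print(peephole(program))
    # -> [('STORE', 'X'), ('ADD', 'Y'), ('STORE', 'Z')]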

But maybe the most interesting thing, as a historical artifact, is the report from the ACM Curriculum Committee on Computer Science titled "An Undergraduate Program in Computer Science--Preliminary Recommendations". Papers on education had appeared in earlier years, but I think this is the first organized recommendation from ACM itself. Here was the status at the time:

At the time of this writing, in the United States, doctorates can be sought at more than 15 universities, master's degrees at more than 30, baccalaureates in at least 17 colleges, and a sizeable number of other colleges and universities are presently planning or considering departments of computer science.

Impressive. I didn't know there were that many before I was born.

At the top of this post is the key table. Each of the sixteen courses there has a several-paragraph description that is well worth reading. I'm particularly intrigued that Graph Theory was already recognized as an important element in the curriculum; I am working to introduce a Graph Theory class for undergrads on our campus, aimed not just at CS but also at economics, the social sciences, and other fields.

Each course in that table is assumed to be a U.S. 3-semester-hour course: 3 hours/week x 15 weeks = 45 hours of in-class time, plus probably twice that in lab or homework time, for a total of, oh, 130-140 hours of work and learning. If you take just the nine "Required" plus "Highly Recommended Electives" courses, you're well over a thousand hours of time invested. Add the seven "Other Electives" at a university that provided them all (noted in the text to be a difficult task), and you have about 2,100 hours. And that's not even counting the "Supporting" courses, let alone any arts and social sciences classes required. In short, this is a pretty thorough and ambitious major!
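If you want to check my arithmetic, here's the back-of-the-envelope calculation (the roughly 135 hours/course figure is my estimate from above, not a number in the report):

    # Rough workload totals, assuming ~135 hours per 3-semester-hour course
    # (45 in-class hours plus roughly twice that outside class).
    HOURS_PER_COURSE = 45 + 2 * 45  # 135

    core = 9       # "Required" plus "Highly Recommended Electives"
    electives = 7  # "Other Electives"

    print(core * HOURS_PER_COURSE)                # 1215 hours: "well over a thousand"
    print((core + electives) * HOURS_PER_COURSE)  # 2160 hours: "about 2,100"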
