I have finally reached my twentieth article in this series! It's probably time for another summary, but first let's do the things that caught my eye in 1977. This was the year that "Star Wars" (with the simple title) came out. I was eleven. I probably wasn't yet particularly aware of computers and computing, but I was already a fan of "Star Trek" and of the space program (which was in the Apollo-Soyuz/space shuttle interlude at the time).
When I first skimmed the contents for 1977, I flagged eighteen articles for further investigation. For a systems person like me, there was a lot to catch the eye. I've winnowed it down, but if you look, there are still more good papers than just these.
If you're interested in data about demographics and numbers of students in CS, there is a report following on from the previous year. What's depressing is not how bad it was in 1976, but how bad our diversity remains today.
As long as we're on the topic of education, it was already possible in 1977 to put together a survey of 200 publications on CS ed. ACM's first curriculum committee issued its report in 1968, though, and SIGCSE was founded in 1970, so that should be no surprise. The CS GRE had also recently been introduced, and was analyzed in CACM.
Dorothy Denning and Peter Denning give an elegant description of how to certify information flow in programs for security purposes. Consider, for example, the simple assignment statement y := x. If the variables y and x are in separate security classes, then there is a potentially unauthorized information flow between the classes. After the execution of this statement, the value of y will equal the value of x, whether we considered that acceptable or not. The Dennings go on to discuss a complete method for static, compile-time analysis of this information flow.
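To make the idea concrete, here is a minimal sketch (my own Python, not the Dennings' notation or algorithm) of the lattice check at the heart of flow certification: every variable is bound to a security class, and an assignment is certified only if the source's class is allowed to flow to the destination's.

```python
# Illustrative sketch only: a tiny lattice check in the spirit of
# compile-time information-flow certification. Class names, functions,
# and the two-level lattice are my own simplifications.

LEVELS = {"LOW": 0, "HIGH": 1}

def can_flow(src_class, dst_class):
    """Information may flow from src to dst only if src <= dst in the lattice."""
    return LEVELS[src_class] <= LEVELS[dst_class]

def certify_assignment(dst_var, src_var, classes):
    """Statically check the assignment 'dst_var := src_var'."""
    if not can_flow(classes[src_var], classes[dst_var]):
        raise ValueError(f"illegal flow: {src_var} ({classes[src_var]}) -> "
                         f"{dst_var} ({classes[dst_var]})")

# y := x with x in a higher class than y is flagged before the program ever runs.
classes = {"x": "HIGH", "y": "LOW"}
certify_assignment("y", "x", classes)   # raises ValueError
```

The real paper handles conditionals, loops, and the implicit flows they create as well; the point here is only that the check is purely static.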
Morgan and Levin provided a mathematical means of assigning files to network nodes to optimize overall system performance. The eye-catching hypercube with cut planes at the top of this blog posting is taken from the article. They even consider multiple copies of the files in the network, dividing traffic into "query" and "update" streams and treating them separately. They recognize that assigning files to nodes is an exponentially complex problem, and provide heuristics for avoiding a search of the entire space.
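To give a flavor of what such a heuristic might look like, here is a toy greedy placement in Python; the cost model, rates, and function names are my own invention for illustration, not Morgan and Levin's formulation.

```python
# Toy greedy heuristic for the file-assignment problem: put each file at the
# node that minimizes its combined query + update traffic, given per-node
# request rates and inter-node costs. Purely illustrative -- the exact
# problem is exponential in the number of files and nodes, which is why
# heuristics are attractive.

def assign_files(files, nodes, query_rate, update_rate, hop_cost):
    """query_rate[f][n], update_rate[f][n]: requests from node n for file f.
    hop_cost[a][b]: cost of carrying traffic between nodes a and b."""
    assignment = {}
    for f in files:
        best_node, best_cost = None, float("inf")
        for candidate in nodes:
            cost = sum((query_rate[f][n] + update_rate[f][n]) * hop_cost[n][candidate]
                       for n in nodes)
            if cost < best_cost:
                best_node, best_cost = candidate, cost
        assignment[f] = best_node
    return assignment

# Two nodes, one file queried mostly from node "B": the file lands on "B".
nodes = ["A", "B"]
queries = {"f1": {"A": 1, "B": 10}}
updates = {"f1": {"A": 0, "B": 0}}
costs = {"A": {"A": 0, "B": 1}, "B": {"A": 1, "B": 0}}
print(assign_files(["f1"], nodes, queries, updates, costs))   # {'f1': 'B'}
```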
Speaking of networks, Tajibnapis developed a routing protocol for the MERIT network in Michigan. His article on TIP was originally submitted in 1974 and finally published in 1977. To me, the NETCHANGE protocol, which specifies the message contents for exchanging routing information, sounds a lot like RIP, which was developed quite a bit later. However, TIP doesn't seem to have had much impact; I'm not sure why. Of course, even by 1974 Dijkstra's shortest-path-first algorithm was fifteen years old, and there was quite a flowering of work on routing in the mid-1970s, so it's likely there was other, similar work. MERIT would go on to be a pivotal place for networking research for decades to come.
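For comparison, here is roughly what the distance-vector idea underlying both protocols looks like; this is my own minimal sketch, not the message format from the TIP paper.

```python
# Minimal distance-vector update, the idea shared by NETCHANGE-style protocols
# and RIP: when a neighbor advertises its distance table, adopt any destination
# that becomes closer by routing through that neighbor. Names and structures
# here are mine, purely for illustration.

def process_advertisement(my_table, neighbor, neighbor_table, link_cost):
    """my_table and neighbor_table map destination -> (distance, next_hop)."""
    changed = False
    for dest, (neighbor_dist, _) in neighbor_table.items():
        candidate = link_cost + neighbor_dist
        current = my_table.get(dest, (float("inf"), None))[0]
        if candidate < current:
            my_table[dest] = (candidate, neighbor)
            changed = True
    return changed   # a change would trigger advertising our own table in turn

# Example: neighbor B, one hop away, knows a route to C at distance 1.
table = {"A": (0, "A")}
process_advertisement(table, "B", {"B": (0, None), "C": (1, None)}, link_cost=1)
print(table)   # {'A': (0, 'A'), 'B': (1, 'B'), 'C': (2, 'B')}
```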
In operating systems work, Lamport wrote about readers and writers, which I still teach about in my OS class today. Fundamental stuff, though this article is a little...rococo in its notation, I think.
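One trick from that literature that is easy to show in a few lines is the version-counter read: the writer bumps one counter before writing and another after, and a reader retries if the two don't match when sampled in the opposite order. This is my own seqlock-style simplification for illustration, not the exact construction from Lamport's paper.

```python
# Seqlock-style illustration of concurrent reading and writing without a lock:
# the writer increments v1 before and v2 after modifying the data; the reader
# samples the counters in the opposite order and retries if a write overlapped.
# A classroom-style sketch, not Lamport's precise algorithm.

class VersionedCell:
    def __init__(self, value):
        self.v1 = 0          # bumped by the writer before it writes
        self.v2 = 0          # bumped by the writer after it writes
        self.value = value

    def write(self, value):
        self.v1 += 1
        self.value = value
        self.v2 += 1

    def read(self):
        while True:
            before = self.v2      # read the counters in the opposite order...
            value = self.value
            after = self.v1
            if before == after:   # ...and accept only if no write intervened
                return value

cell = VersionedCell(42)
cell.write(43)
print(cell.read())   # 43
```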
Parallel computing was in its earliest phase of trendiness. While I associate much of the theory of correctness in parallel programming with K. Mani Chandy, there is a paper here on how to prove parallel programs correct, using Susan Owicki's techniques and an implementation of Dijkstra's on-the-fly garbage collector as the example. As author David Gries says,
"Building a program with little regard to correctness and then debugging it to find errors is even more folly for parallel programs than it is for sequential programs."
Word.
And finally, if you want some history to go with your history, there are Rabin's and Scott's Turing Award lectures, given for their work on nondeterministic automata some years prior.
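If you want a small reminder of what nondeterminism means in practice, here are a few lines (example machine and names are my own) that simulate an NFA by tracking the set of all states it could currently be in.

```python
# Simulate a nondeterministic finite automaton by tracking the set of states
# it could occupy after each symbol -- the subset construction, done on the fly.
# The example machine is my own, purely for illustration.

def nfa_accepts(transitions, start, accepting, word):
    """transitions maps (state, symbol) -> set of possible next states."""
    current = {start}
    for symbol in word:
        current = set().union(*(transitions.get((s, symbol), set()) for s in current))
        if not current:
            return False
    return bool(current & accepting)

# An NFA over {a, b} accepting strings that end in "ab".
transitions = {
    ("q0", "a"): {"q0", "q1"},
    ("q0", "b"): {"q0"},
    ("q1", "b"): {"q2"},
}
print(nfa_accepts(transitions, "q0", {"q2"}, "aab"))   # True
print(nfa_accepts(transitions, "q0", {"q2"}, "aba"))   # False
```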