Programming Languages (CS302 2006S)

Papers on LISP

Comments on:

Graham, P. (2002). The Roots of Lisp. Web essay at http://lib.store.yahoo.net/lib/paulgraham/jmc.ps (dated 18 January 2002; visited 1 February 2006).

McCarthy, J. (1960). Recursive Functions of Symbolic Expressions and Their Computation by Machine, Part I. Communications of the ACM 3(4), pp. 184-195.

Comments from: Peter Brown (*), Michael Claveria, Davis Hart, Brad Miller, Angeline Namai, Mark Nettling, Dimitar Tasev, Dave Ventresca (late).

No comments from: Alex Leach (x)


Notes on eval

On a more technical level, I think I am beginning to understand the eval function a little better, but I guess I am still confused about the power behind it. Both authors say that you can use eval to write the language itself, and while this is cool, it seems to create a chicken-and-egg scenario in which, to write the language, you need to have already written the language.

I found the reading on Lisp quite understandable, though I am still curious (from the reading for Wednesday) as to how one would create the first compiler for Lisp in Lisp. Even though the language is recursive and you can define it using itself, it still has primitives, and those need to be turned into assembler. Maybe I'm just misinterpreting what was meant by Lisp being the first language to have its compiler written in itself.

The key point is that you can write eval for a more complex language using only a few basic operations. Those you need to implement separately.
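As a rough illustration (not Graham's full eval., just a sketch of the idea), here is a tiny evaluator written in Scheme that handles only variable lookup, quote, car, cdr, and cons. The name tiny-eval and the association-list environment are choices made for this example, not anything in the papers; the point is simply that the evaluator itself needs nothing beyond a handful of primitives.

    ;; A minimal sketch: an evaluator for a tiny subset of Lisp, built from a
    ;; few primitives (symbol?, eq?, car, cdr, cons, assq) plus cond and quote.
    (define (tiny-eval expr env)
      (cond ((symbol? expr)                       ; variable: look it up
             (cdr (assq expr env)))               ;   (assumes the symbol is bound)
            ((eq? (car expr) 'quote)              ; (quote x) => x, unevaluated
             (cadr expr))
            ((eq? (car expr) 'car)
             (car (tiny-eval (cadr expr) env)))
            ((eq? (car expr) 'cdr)
             (cdr (tiny-eval (cadr expr) env)))
            ((eq? (car expr) 'cons)
             (cons (tiny-eval (cadr expr) env)
                   (tiny-eval (caddr expr) env)))
            (else 'unrecognized-form)))

    ;; Example: (tiny-eval '(cons (quote a) (quote (b c))) '())  =>  (a b c)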

I also would not mind if we went over the eval function that Graham presents in class.

Can you go over the eval function in class? I still don't understand it fully.

Certainly.


My initial reaction while reading this was that it was extremely dense and boring. However, after realizing that it was written in 1960, I thought that it was pretty intelligent. He proposes mathematical reasoning/proofs for all the functions of Lisp (some of which are convoluted). While I could have learned how Lisp works by reading the summary of his work in Graham's article, there was more insight in McCarthy's report as to how programming languages are designed intelligently so that the functions of the language will work in theory before being implemented. The second part (where he started writing things in all caps) left me a little confused; I could not quite comprehend what he was saying. Overall, though, it seemed to be very advanced computer science and mathematical proof.

Why are the conditional expressions on page 176 of McCarthy evaluated from left to right when there are multiple true statements? I realize that this is the convention but I'm wondering why it was chosen. Why not the last true statement?

It's more efficient to use the first one. Some languages (e.g., the conditionals you saw from Gries) are explicitly non-deterministic or permit the implementer to choose.
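For instance, Scheme's cond follows the same convention; only the first true clause counts. A tiny illustration:

    ;; Both tests are true, but evaluation stops at the first true clause;
    ;; the second test and its consequent are never evaluated.
    (cond ((> 5 1) 'first)
          ((> 5 2) 'second))    ; => first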

What's the difference between M-expressions and S-expressions? Is it just that M-expressions use lower-case letters, distinguishing them from the upper-case S-expressions? How are they used differently?

What are some situations where we would want to use quote to protect a list from evaluation? Why don't many other programming languages use it?

When you want to build a list without typing lots of cons statements. When you have a symbol that has a separate meaning (i.e., that is in the symbol table). When you're generating code for evaluation in another environment. Other languages don't use it because they don't have a first-class symbol type.
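A few small Scheme examples of those situations (the particular expressions are just illustrative, and eval's calling convention varies a bit by dialect):

    '(1 2 3)                       ; the list (1 2 3), with no chain of cons calls
    (quote car)                    ; the symbol car, not the procedure it names
    (define expr '(+ 1 2))         ; code built as data, to be evaluated later...
    (eval expr (interaction-environment))   ; ...possibly in another environment => 3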

As was to be expected, I found the Graham article much easier to read than the McCarthy one. I imagine this was in part due to the time periods in which the respective articles were written, as well as the purposes behind them. The articles impressed me in two ways. First, by the number of functions and ideas that the modern programmer takes for granted but that McCarthy had to introduce as foreign concepts in his paper. Things such as conditionals, functions, and propositions, which may have previously existed on a theoretical level, first came to the computer, it seems, through LISP. The second thing that impressed me was how similar the papers are in their terms and ideas. Clearly, Graham intended to present a more modern version that was easier to follow than McCarthy's original document, and he does so very well. What impressed me, though, was how, even after 40 years, Graham still used much of the original language, suggesting that he feels it is the easiest way to explain it.

McCarthy's article clarified the basic premise of LISP and Scheme which I have missed since day one of CS at Grinnell. The mathematical definitions neatly transformed into a programming language helped me see the simplicity with which LISP was first developed. However, its design and transformation into a programming language lack the fundamental element of readability. I, for one, find it difficult to follow the stacks of parentheses on paper and screen. The syntax of the language encourages programmers to write code all on one line, rather than one line per function. Finally, I wonder how the efficiency of this language was impacted by the fact that it originally contained few functions. Could the language have been extended to include more built-in functions without damaging its simplicity?

McCarthy discusses the higher-order functions apply and maplist, and implies that LISP's support for such functions significantly reduces the amount of time a programmer takes to factor code. In Revenge of the Nerds, Graham also claimed that one line of LISP can replace 20 lines of C, and although he did not mention higher-order functions, I believe that they are a factor in the speed of development and conciseness of code written in functional languages. It therefore appears that development time underlies both McCarthy's and Graham's judgment of LISP as a good programming language. However, from CS201, I know that C has the fastest execution time, but I also know that C cannot support programs nearly as complex as some of those written in Scheme or other dialects of LISP. Is it then safe to conclude that there exists a trade-off between execution/run time and development time?

Well, pure assembly is faster than C. You can also write a LISP interpreter in either. One of the hoped-for silver bullets of software engineering is getting both speedier development and speedier execution. It seems to be the case that a modern optimizing compiler can generate better assembly than all but the best assembly programmers, so we do get both.
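To make the conciseness point a bit more concrete, here is a small Scheme sketch of the kind of code a higher-order procedure collapses (map playing roughly the role of McCarthy's maplist):

    ;; With a higher-order procedure, the traversal disappears into map:
    (map (lambda (n) (* n n)) '(1 2 3 4))        ; => (1 4 9 16)

    ;; The explicit recursion that a language without higher-order
    ;; procedures forces you to spell out by hand:
    (define (square-all lst)
      (if (null? lst)
          '()
          (cons (* (car lst) (car lst))
                (square-all (cdr lst)))))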

In The Roots of Lisp, Graham says that "one of [McCarthy's] key ideas was to use a simple data structure called a list for both code and data." I do not quite understand what Graham is emphasizing as a distinguishing feature of LISP. Is it the fact that both code and data take the same form, or is it the fact that code and data take a simple form?

Graham and McCarthy both clearly think that a LISP-like programming language (or possibly LISP itself) is the top of the food chain as far as computer science is concerned. If this is so, then why hasn't LISP become the modern ideal? What I mean is, why haven't we adopted LISP (or a LISP-like language) completely, if it is indeed possible to code a solution to any problem in LISP? Both authors mention that modern programming languages strive toward being more LISP-like, so then what's the point of other languages?

This seems to be more a reaction to the first set of readings than the second. Remember, McCarthy was writing in 1960, and so was comparing LISP to things like Algol-58 and FORTRAN. I think Graham would suggest that there's little point to the non-LISP languages. We haven't adopted LISP-like languages wholesale because, as many of you have seen, many people can't grasp the subtleties of the language/paradigm.

Another question that comes to mind is, why did McCarthy write his description of LISP in such an unreadable fashion? Graham is able to communicate basically the same information in a manner that is simple to understand. I think part of it has to do with his notation and the transparency of his examples.

Different audience. Computer scientists in 1960 expected formal, mathematical definitions. (Many still do today.)

I noticed too that there are several things about LISP that have been changed in Scheme. Is there a specific reason that Scheme was significantly altered from the original form of LISP?

Some aspects of LISP are hard to define formally (particularly dynamic scoping). Scheme was designed to be cleaner, and to incorporate some new ideas from the intervening years of language design. LISP had also segmented into a host of dialects and, I expect, by choosing a new name, the designers hoped to make it clear that there should be only one dialect of Scheme.

There are also several expressions that both Graham and McCarthy use in their papers that I was unclear about: defun, label, the 'f' notation, and the difference between quote or ' in LISP and ' or list in Scheme.

Paul Graham's The Roots of Lisp offers a compact but effective display of Lisp's potential as a programming language. By discussing its seven primitive operators, Graham assures us that all (or almost all) functions can be represented in this language. The paper does not discuss the efficiency of these primitives, but suggests a greater advantage in complex data manipulation for Lisp programmers. The addition of label allows the user to represent recursive functions, and furthermore, the language is able to define eval., an interpreter for itself. Even though Graham states that the purpose of his paper is to show where [programming] languages are heading, I am still not convinced that is the case with the programming languages of today. Many of the widely used programming languages of today are still ages behind Lisp's philosophy of dealing with data recursively. One example is the plethora of recursion-based programs that can be written in no more than a few lines of Lisp code yet can hardly be implemented as concisely in other programming languages.
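As a small illustration of the point about label and recursion, here is Graham's subst example rendered in Scheme, with define playing the role Graham gives to label (the translation is a sketch, not Graham's own code):

    ;; Replace every occurrence of y with x in the expression z.
    (define subst
      (lambda (x y z)
        (cond ((pair? z) (cons (subst x y (car z))
                               (subst x y (cdr z))))
              ((eq? z y) x)
              (else z))))

    ;; (subst 'm 'b '(a b (a b c) d))  =>  (a m (a m c) d)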

Disclaimer: I usually create these pages on the fly, which means that I rarely proofread them and they may contain bad grammar and incorrect details. It also means that I tend to update them regularly (see the history for more details). Feel free to contact me with any suggestions for changes.

This document was generated by Siteweaver on Wed May 10 09:03:17 2006.
The source to the document was last modified on Mon Feb 6 09:22:45 2006.
This document may be found at http://www.cs.grinnell.edu/~rebelsky/Courses/CS302/2006S/Readings/lisp-2.html.


Samuel A. Rebelsky, rebelsky@grinnell.edu