/* ____ __ __ __ _____ / __ )/ /___ _____/ /__/ |/ / / / __ / / __ \/ ___/ //_/ /|_/ / / / /_/ / / /_/ / /__/ ,< / / / / /___ /_____/_/\____/\___/_/|_/_/ /_/_____/ */ head[ title[Recap of John McCarthy's Recursive Functions of Symbolic Expressions and Their Computation by Machine, Part I] h3[Judith Lindemann] h5[Berlin, 25 December 2013] ]
h1[Preface] This text originated as an exercise for a university course on scientific writing at the Beuth University of Applied Sciences Berlin. The assignment was to choose a computer science paper, reproduce its key ideas in one's own words, and add some of one's own thoughts on the topic as a conclusion. I have selected the classic paper "b[Recursive Functions of Symbolic Expressions and Their Computation by Machine, Part I]" by John McCarthy from 1960 (id[LISP]), because it permits a fascinating look into the history of programming languages and is the origin of many concepts that are still relevant today. This text is also influenced by Paul Graham's article "b[Roots of Lisp]" from 2002 (id[ROOTS]) about that McCarthy paper. I follow Paul Graham's approach of providing code examples in actual LISP code instead of m-expressions, and of treating c[quote] and c[cond] as elementary functions. toc[Contents]
sec[Introduction][ The paper (id[LISP]) describes a dynamically typed, functional programming language called LISP. The name LISP is an abbreviation of b[LIS]t b[P]rocessor, which is a very fitting name, because the whole syntax is based on a simple list notation for code and data. LISP was developed in 1958, two years before the paper was published. The main motivation for its development was the lack of appropriate programming languages for artificial intelligence applications. At that time FORTRAN was the dominant high-level programming language, but it was designed for numeric calculations and engineering tasks and was therefore not a good fit for AI problems. LISP was influenced by IPL (Information Processing Language), an experimental programming language from 1957 (see id[IPL]). IPL was dedicated to AI research, but it was also inappropriate because it was an assembly language. Some of the IPL concepts that LISP adopted and heavily improved were list processing, higher-order functions, recursion, and computation with symbols. Other concepts were new, for example conditional expressions, garbage collection, lazy evaluation, and dynamic typing. First, we will learn about the mathematical concepts behind LISP. Then, we will see that early LISP had only two simple data types. After that, we will define the elementary functions and use them as building blocks to create our own functions. Then, we will see how memory management works. Finally, we will look at how LISP has fared over the past 55 years and how it is doing today. ]/* Introduction */
sec[Mathematical concepts][ sec[Propositional expressions][ Propositional expressions are expressions whose values are either c[T] "true" or c[F] "false". These expressions are often combined with connectives like c[∧] "and", c[∨] "or" and c[¬] "not". Typical examples are: math[$$x < y$$ $$(x < y) \land (b = c)$$] ]/* Propositional Expressions */ sec[Conditional expressions][ The notation for conditional expressions was a new concept, developed by McCarthy in 1960. It is the ancestor of the "if...then...else" construct, which is part of nearly every programming language nowadays. Conditional expressions allow a recursive definition of functions in a convenient way.
A conditional expression has the form: math[$$(p_1 \rightarrow e_1,\cdots,p_n \rightarrow e_n)$$] The b[p]’s are propositional expressions that are either true or false. The b[e]’s can be any kind of expression. One can read it as "if b[p]sub[1] then b[e]sub[1], else if b[p]sub[2] then b[e]sub[2], ..., else if b[p]sub[n] then b[e]sub[n]" or "b[p]sub[1] yields b[e]sub[1], ..., b[p]sub[n] yields b[e]sub[n]". The b[p]’s are evaluated from left to right. When the first true b[p] is found, the conditional expression returns the b[e] that belongs to it. math[$$(1 < 2 \rightarrow 4, 1 > 2 \rightarrow 3) = 4$$ $$(2 < 1 \rightarrow 4, 2 > 1 \rightarrow 3, 2 > 1 \rightarrow 2) = 3$$ $$(2 < 1 \rightarrow 4, T \rightarrow 3) = 3$$ $$(2 < 1 \rightarrow {0 \over 0}, T \rightarrow 3) = 3$$] The whole conditional expression is undefined: ol[ - if all b[p]'s are false, - if an undefined b[p] occurs before a true b[p] occurs, - or if the b[e] that belongs to the first true b[p] is itself undefined ] math[$$(2 < 1 \rightarrow 3, 4 < 1 \rightarrow 4) \mbox{ is undefined}$$ $$({0 \over 0} < 1 \rightarrow 3, 1 < 4 \rightarrow 4) \mbox{ is undefined}$$ $$(2 < 1 \rightarrow 3, T \rightarrow {0 \over 0} )\mbox{ is undefined}$$] ][COND]/* Conditional expressions */
sec[Recursive function definitions][ With the help of conditional expressions it is easy to define recursive functions. The factorial of a non-negative integer b[n] can be described as follows: math[$$n! = (n = 0 \rightarrow 1, T \rightarrow n \cdot(n - 1)!)$$] The evaluation of 0! returns 1. The evaluation of 2! looks as follows: math[\\begin{eqnarray*} 2! &=& (2 = 0 \\rightarrow 1, T \\rightarrow 2 \\cdot (2 - 1)!)\\\\ &=& 2 \\cdot 1!\\\\ &=& 2 \\cdot (1 = 0 \\rightarrow 1, T \\rightarrow 1 \\cdot (1 - 1)!)\\\\ &=& 2 \\cdot 1 \\cdot 0!\\\\ &=& 2 \\cdot 1 \\cdot (0 = 0 \\rightarrow 1, T \\rightarrow 0\\cdot(0-1)!)\\\\ &=&2\\cdot1\\cdot1\\\\ &=&2 \\end{eqnarray*}]
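Anticipating the LISP notation introduced in the later sections, the same definition can be sketched in Common Lisp. This example is not from the paper: the arithmetic and comparison operators c[=], c[*], and c[-] are not among McCarthy's elementary functions, and the name c[factorial] is our own choice.
code[lisp][> (defun factorial (n)
    (cond ((= n 0) 1)                        ;; n = 0 -> 1
          (t (* n (factorial (- n 1))))))    ;; T -> n * (n - 1)!
> (factorial 0)
1
> (factorial 2)
2
> (factorial 5)
120]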
]/* Recursive function definitions */
sec[Lambda calculus][ The lambda calculus is a formal notation that is used in LISP to create new functions and to pass functions as arguments. It was introduced by Alonzo Church in 1941 (see id[LAMBDA]). Church distinguishes between forms and functions. An expression like im[$y^2 + x$] is a form. An expression like im[$f(3, 4)$] is a function applied to arguments. im[$y^2 + x$] is not a function, because the expression im[$(y^2 + x)(3, 4)$] is not determined and could evaluate to 19 or to 13. The problem is that the order in which the arguments 3 and 4 are inserted into the form is undefined. To convert a form into a function we can write: math[$$\lambda((x_1, \cdots, x_n),\cal E)$$] im[$\cal E$] is a form and im[$x_1, \cdots, x_n$] are the ordered parameters for im[$\cal E$]. The λ-expression is a function because the variables in im[$\cal E$] can be substituted with arguments in the order of the parameter list im[$x_1, \cdots, x_n$]. We say that the variables of a λ-expression are bound. The example from above now looks like this: math[$$\lambda((x,y),y^2 +x)$$] And with arguments like this: math[$$\lambda((x,y),y^2 +x)(3,4) = 19$$] If we want to define a recursive function like math[$${\rm sqrt}(a,x,\epsilon) = (|x^2 - a| < \epsilon \rightarrow x, T \rightarrow {\rm sqrt}(a, {1 \over 2}(x + {a \over x}),\epsilon))$$] in lambda notation math[$${\rm sqrt} = \lambda((a,x,\epsilon),(|x^2 - a| < \epsilon \rightarrow x, T\rightarrow {\rm sqrt} (a,{1 \over 2}(x + {a \over x}), \epsilon))),$$] we find that this definition is inadequate: the λ-expression on the right-hand side is anonymous, so the occurrence of im[${\rm sqrt}$] inside it does not refer to the function being defined. (Remember, a call of the function would look like im[${\rm sqrt}(a,x,\epsilon)$].) In order to define recursive λ-expressions, we must introduce a new notation: math[$$label(f,\cal E)$$] b[f] can be seen as the function name. Occurrences of b[f] within im[$\cal E$] are evaluated to the label-expression itself, as if b[f] were a parameter of the function. math[$$label(sqrt, \lambda((a,x,\epsilon),(| x^2 - a| < \epsilon \rightarrow x, T \rightarrow {\rm sqrt} (a, {1 \over 2}(x + {a \over x}),\epsilon))))$$] ][LAMBDACALCULUS]/* Lambda calculus */ ]/* Mathematical concepts behind Lisp */
sec[S-expressions and data structures][ sec[S-expressions][ The whole syntax of LISP is made of symbolic expressions, or s-expressions for short. All code and data can be expressed with the following syntax elements: s-expressions are formed from the characters c[(], c[)], c[.], and an infinite set of distinguishable atoms. The only data types in this early version of LISP are atoms and lists. ]/* S-expressions */ sec[Atoms][ Atoms are numbers or strings of characters. These strings are often symbols that represent another atom or a list. When LISP code is interpreted or compiled, symbols are replaced by the data items they reference. code[lisp][1 12 a ab aaaaabb t nil] ]/* Atoms */ sec[Lists][ sec[Pairs][ Pairs are also called cons cells or conses. They are written in this form: code[lisp][(1 . 2) (a . b)] A pair has two pointers to values. The first pointer is called c[car] and the second pointer is called c[cdr]. We will see later why this is relevant. img[pair.svg] cap[Fig][Memory representation of two pairs] ][PAIR]/* Pairs */ sec[Singly linked lists][ Pairs can be used to build more complex data structures like singly linked lists. code[lisp][(1 . (2 . (3 . nil)))] img[list.svg] cap[Fig][Memory representation of a list] We can see that the c[car] pointer points to the first element of a pair and that the c[cdr] pointer points to the second element, which is the rest of the list structure. The list has the form c[(1 . rest)], where c[rest] is the sublist c[(2 . (3 . nil))]. The sublist c[(2 . (3 . nil))] is again a pair c[(2 . rest)], where c[rest] is c[(3 . nil)]. The list notation can be abbreviated: code[lisp][(1 2 3)] Because the abbreviated notation omits the c[.] dot, pairs and lists remain distinguishable. Now we give some examples of lists in their abbreviated and their dotted-pair notation. code[lisp][(1) (1 . nil) (a b) (a . (b . nil)) (1 2 (3 4)) (1 . (2 . ((3 . (4 . nil)) . nil))) (1 2 . 3) (1 . (2 . 3)) ((a b) c) ((a . (b . nil)) . (c . nil))] ]/* Singly linked lists */ sec[Trees][ Some data structures can also be seen as binary trees. code[lisp][((1 . 2) . (3 . 4))]
img[tree.svg] cap[Fig][Memory representation of a tree] ]/* Trees */ ][LISTS]/* Lists */ ]/* S-expressions and data structures */
sec[Functions][ sec[Introduction to functions][ In the original paper McCarthy did not use s-expressions to describe LISP functions, but m-expressions, which stands for meta-expressions. M-expressions are very similar to the standard notation for functions. They take s-expressions as data, and to avoid ambiguity they use brackets instead of parentheses. An m-expression function could look like this: math[$$function\[(A \cdot B); x\]$$] M-expressions that describe LISP functions are called s-functions. However, to use these s-functions for computation, they had to be converted into pure s-expressions. Because the converted s-functions are much more readable in their s-expression notation, and because the additional level of indirection is unnecessary, m-expressions were almost never used for real LISP programming. For this reason all examples from the McCarthy paper have been converted into runnable LISP code. Common Lisp was the best choice for this purpose, because it implements, in contrast to Scheme or Clojure, all functions we will take a closer look at in a moment. Now we give a very short introduction to the usage of functions. This is not part of the McCarthy paper, but it is helpful for following the Common Lisp examples. LISP function calls are also lists and look as follows: code[lisp][(operator arg1 arg2 ... argn)] Some simple examples are given with arithmetic operators: code[lisp][> (+ 1 2) 3 > (* 2 (- 6 3)) 6] Operators are applied from left to right: c[(+ 1 2 3 4 5)] is equal to c[1 + 2 + 3 + 4 + 5]. Nested expressions are evaluated first: c[(+ (* 3 1) (* 2 3))] is equal to c[(+ 3 6)]. This notation for arithmetic operations is also known as Polish notation and was invented by Jan Łukasiewicz in 1928 (see a[http://en.wikipedia.org/wiki/Polish_notation][Wikipedia]). To distinguish data from function calls we use a single quote c['] to mark expressions as data. This prevents them from being evaluated. The c['] is an abbreviation for c[(quote x)], where c[x] can be any expression. code[lisp][> (1 2) Error ;; Because 1 is not a function name > '(1 2) (1 2) > (quote (1 2)) (1 2) > s Error ;; Because s is undefined > 's ;; Undefined symbols must be quoted s > (defparameter s 7) ;; Defines variable s > s 7] ]/* Introduction to functions */
sec[Elementary functions][ McCarthy defines five elementary functions in his paper: c[atom], c[eq], c[car], c[cdr], and c[cons]. These functions are implemented in assembly code and cannot be defined in terms of other expressions. We will see later how we can use these functions as building blocks for new functions. h1[atom] c[(atom x)] returns c[t] "true" if c[x] is an atom. Otherwise it returns c[nil]. c[nil] represents falsity and is equal to the empty list c[()]. code[lisp][> (atom 1) t > (atom '(1 2)) nil > (atom ()) t] h1[eq] c[(eq x y)] returns c[t] if c[x] and c[y] are the same atom. Otherwise it returns c[nil]. code[lisp][> (eq 1 4) nil > (eq 'a 'a) t > (eq nil ()) t] h1[car] c[(car x)] expects that c[x] is not an atom, but a list. It returns the first element of c[x]. c[car] stands for "b[c]ontents of the b[a]ddress part of b[r]egister". We have seen in section id[PAIR] that the first pointer of a pair is also called c[car], because it points to the first element of the list. code[lisp][> (car '(1 . 2)) 1 > (car '(a b c)) a] h1[cdr] c[(cdr x)] also expects that c[x] is not an atom, but a list.
It returns all elements after the first element of c[x]. c[cdr] is the abbreviation of "b[c]ontents of the b[d]ecrement part of b[r]egister". The c[cdr] pointer of a list points to everything after the c[car] pointer. code[lisp][> (cdr '(1 . 2)) 2 > (cdr '(a b c)) (b c) > (cdr '((a b) c)) (c)] h1[cons] c[(cons x y)] creates the new pair c[(x . y)] out of c[x] and c[y]. c[x] and c[y] can be atoms or lists. c[cons] stands for b[cons]truct. code[lisp][> (cons 1 2) (1 . 2) > (cons 1 '(a b c)) (1 a b c) > (cons '(1 2) '(a b)) ((1 2) a b) > (cons '(1 c) '(3 . 3)) ((1 c) 3 . 3)] Some examples of how c[car], c[cdr] and c[cons] can interact: code[lisp][> (car (cons 1 'a)) 1 > (cdr (cons '(1 2 3) '(a b c))) (a b c) > (cons (car '(1 2)) (cdr '(a b c))) (1 b c)] h1[cond] c[cond] is not directly one of the 5 elementary functions from the paper. McCarthy introduced it implicitly as a conversion rule from m-expressions to s-expressions. According to Paul Graham (see id[ROOTS]), c[cond] is so essential that it can be seen as a sixth elementary function. c[(cond (p1 e1)...(pn en))] is used to control the program flow. It works as described in section id[COND]. code[lisp][> (cond ((atom 'a) 1) ;; if 'a is an atom then return 1 ((eq 1 1) 2) ;; else if 1 is equal to 1 then return 2 (t 0)) ;; else return 0 1] h1[quote] The same applies to c[quote] as to c[cond]: it was only introduced implicitly, but it is important enough to be seen as the seventh elementary function. We already know how c[quote] works, but for the sake of completeness: c[(quote x)] takes any expression c[x] and prevents its evaluation. It can be abbreviated as c['x]. code[lisp][> (quote (cons 1 1)) (cons 1 1) > '(cons 1 1) (cons 1 1)] ]/* Elementary functions */
sec[Defining functions][ c[lambda] and c[label] provide a notation for defining our own functions. c[lambda] is used, as described in section id[LAMBDACALCULUS], to bind arguments to an anonymous function. The s-expression form of a λ-expression looks like this: c[(lambda (p1...pn) e)], and a function call with arguments has this form: c[((lambda (p1...pn) e) a1...an)]. code[lisp][> ((lambda (x y) (cons x y)) 1 2) (1 . 2)] For recursive function definitions we need, as in section id[LAMBDACALCULUS], the c[label] notation c[(label f (lambda (p1...pn) e))]. Since c[label] is unavailable in Common Lisp, we give no code example and just abbreviate c[(label f (lambda (p1...pn) e))] to c[(defun f (p1...pn) e)]. c[defun] is a Common Lisp macro that defines a named function and has this form: code[lisp][(defun f (p1 ... pn) e)]
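As a small illustration that is not from the paper, we can already combine c[defun], c[cond], and the elementary functions into a new recursive function; the name c[firstatom] is our own. It follows the c[car] pointers until it reaches an atom:
code[lisp][> (defun firstatom (x)
    (cond ((atom x) x)                 ;; an atom is its own first atom
          (t (firstatom (car x)))))    ;; otherwise descend into the first element
> (firstatom '((1 2) 3))
1
> (firstatom 'a)
a]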
Now we have the power to create all computable functions! ]/* Defining functions */
sec[Creating new functions][ McCarthy gives in his paper some examples of new functions like c[ff], c[subset], c[equal], c[null], c[cadr], c[caddr], c[list], c[among], c[pair], c[assoc], c[sublis], c[sub2], c[apply], c[appq], c[eval], c[maplist], and c[search]. With these small examples he prepares the bigger c[eval]/c[apply] example. With these functions he shows that a complete LISP interpreter can be built out of the elementary functions in a few lines of code (see Paul Graham's Common Lisp version of it, id[EVAL]). We will pick some interesting smaller examples to show the capabilities of this early version of LISP. (Note that c[not], c[and], c[or], c[null], c[append], and c[maplist] are the names of standard Common Lisp operators, which portable programs are not allowed to redefine; the appendix therefore appends a dot to these names, while we keep McCarthy's names here for readability.) h1[Logical connectives] With the help of c[cond] it is easy to build c[(not p)], c[(and p q)] and c[(or p q)]. c[p] and c[q] can be any expressions. c[not], c[and], and c[or] will return c[t] or c[nil]. code[lisp][> (defun not (p) (cond (p nil) (t t))) > (not t) nil > (defun and (p q) (cond (p (cond (q t) (t nil))) (t nil))) > (and t t) t > (defun or (p q) (cond (p t) (t (cond (q t) (t nil))))) > (or nil t) t] h1[Recursive functions] c[(append x y)] takes two lists and returns them concatenated. code[lisp][> (defun null (x) (eq x nil)) > (defun append (x y) (cond ((null x) y) (t (cons (car x) (append (cdr x) y))))) > (append '(1 1) '(a a)) (1 1 a a)] c[append] consists of two conditional clauses. The first clause c[((null x) y)] is the termination condition and returns c[y] if c[x] is c[nil]. The second clause c[(t (cons (car x) (append (cdr x) y)))] is evaluated whenever c[x] is not c[nil] and does a couple of things: ol[ - it c[cons]tructs a new list from the c[car] of c[x] and the result of c[(append (cdr x) y)] - it calls c[(append (cdr x) y)] recursively and builds a stack with the c[cdr]'s of c[x] until c[(cdr x)] returns c[nil] - if c[(cdr x)] returns c[nil], a list is c[cons]tructed from the last element of c[x] and c[y] ] Now we show a part of the stack for c[(append '(1 2 3) '(6 6))] to illustrate how the recursion works in detail: code[lisp][(append '(1 2 3) '(6 6)) (cons 1 (append '(2 3) '(6 6))) (cons 1 (cons 2 (append '(3) '(6 6)))) (cons 1 (cons 2 (cons 3 (append nil '(6 6))))) ;; x is nil (cons 1 (cons 2 (cons 3 '(6 6)))) (cons 1 (cons 2 '(3 6 6))) (cons 1 '(2 3 6 6)) '(1 2 3 6 6)] At the point where c[x] is c[nil], the termination condition c[((null nil) '(6 6))] is invoked, and the last call of c[append] returns c['(6 6)]. Then the stack unwinds to the resulting list c['(1 2 3 6 6)]. h1[Higher-order functions] Higher-order functions are functions that take other functions as arguments, or functions that return functions. c[(maplist f x)] takes the function c[f] and the list c[x] as arguments and applies c[f] to every element of c[x]. For this example McCarthy introduces arithmetic operators like c[(+ a1...an)] and c[(* a1...an)], but he does not explain further how they are realized internally. code[lisp][;; In Common Lisp > (defun maplist (f x) (cond ((null x) nil) (t (cons (funcall f (car x)) (maplist f (cdr x)))))) > (defun double (x) (* x 2)) > (maplist #'double '(5 6)) (10 12) ;; "funcall f" and "#'double" are "special" CL syntax ;; and theoretically unnecessary. ;; The syntax of Scheme is much cleaner: ;; In Scheme > (define (maplist f x) (cond ((null? x) '()) (else (cons (f (car x)) (maplist f (cdr x)))))) > (define (double x) (* 2 x)) > (maplist double '(5 6)) (10 12)] ]/* Creating new functions */ ]/* Functions */
sec[Memory management][ sec[Representation of s-expressions by list structures][ We have already seen in section id[LISTS] how simple s-expressions are represented in memory. Now we will see how more complex s-expressions can be stored. The subdivided rectangle represents a register that stores two pointers. This notation was introduced by Allen Newell and J. C. Shaw in 1957 (see id[IPL]). Substructures can occur in more than one place. The lists c[(1 3)] and c[(2 3)] share the pair c[(3 . nil)]. This has the advantage that subexpressions that occur in more than one place need to be stored only once. img[share.svg] In some cases it is possible that a subexpression that occurs more than once is represented in more than one way. This depends on the history of the program. The list c[((1 . 2) . (1 . 2))] is an example. img[history.svg]
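This dependence on the program's history can be observed directly with c[eq], which returns c[t] only when both arguments are the very same register. The following sketch is not from the paper; c[defparameter] and the variable names are our own:
code[lisp][> (defparameter p (cons 1 2))
> (defparameter shared (cons p p))                      ;; both halves point to the same pair
> (defparameter separate (cons (cons 1 2) (cons 1 2)))  ;; two distinct but equal pairs
> (eq (car shared) (cdr shared))
t
> (eq (car separate) (cdr separate))
nil]
Both c[shared] and c[separate] hold the s-expression c[((1 . 2) . (1 . 2))], but only in c[shared] is the subexpression c[(1 . 2)] stored once.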
Cycles in list structures are forbidden. img[circle.svg] ]/* Representation of s-expressions by list structures */
sec[Association lists][ Association lists, also known as property lists, are lists of a special form that represent symbols. The first c[car] field contains a special constant that tells the system that it is dealing with a symbol. The association list can contain further information about the symbol, such as: ul[ - the print name - a numerical value if the symbol represents a number - another s-expression if the symbol serves as a name for it - the location of a routine if the symbol represents a function ] McCarthy describes in his paper only how print names are represented in association lists. The association list of the symbol c[differentiate] has a substructure of this form: img[pname.svg] Since LISP was developed on an IBM 704, which had a 36-bit architecture, each word in this example holds six 6-bit characters. ]/* Association lists */
sec[Garbage collection][ Garbage collection is one of the most important concepts introduced in the paper. It is a form of automatic memory management and is used to free the memory of unneeded list structures without any action by the programmer. img[gc.svg] cap[Fig][The pairs c[(c . nil)] and c[(4 . 8)] are garbage.] A so-called free-storage list is used to catalogue the available free memory. The free-storage list starts with a register called c[free], which holds a pointer to the first free register in the list. If the program requires a register for a new list structure, the first register of the free-storage list is taken, and the c[free] pointer is advanced to the second register of the free-storage list. This mechanism is known today as dynamic memory allocation. To decide which registers are still in use and which are abandoned, the base register holds pointers to all accessible list structures. If a list structure is not reachable from the base register by a chain of c[car] and c[cdr] operations, it is garbage. To free the memory of these inaccessible lists, a mark-and-sweep algorithm is used. In the first phase of mark-and-sweep, all accessible registers are marked. In the second phase, all unmarked registers are appended to the end of the free-storage list and become reusable. In the third phase, all marked registers are unmarked again. The garbage collection mechanism is only activated when the program runs out of free storage.
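To make these phases more concrete, here is a toy simulation in Common Lisp. It is only a sketch under our own assumptions: a fixed pool of registers modelled as structs, a single list c[*base*] of accessible structures, and a c[sweep] that combines the second and third phase. None of these names appear in the paper.
code[lisp][;; Registers with car, cdr, and a mark bit (our own model, not the IBM 704 layout).
(defstruct reg car cdr mark)

(defparameter *heap* (loop repeat 8 collect (make-reg))) ;; fixed pool of registers
(defparameter *free* (copy-list *heap*))                 ;; free-storage list
(defparameter *base* nil)                                ;; accessible list structures

(defun allocate (a d)
  "Take the first register from the free-storage list (a real system
would run the garbage collector here when the list is empty)."
  (let ((r (pop *free*)))
    (setf (reg-car r) a (reg-cdr r) d)
    r))

(defun mark (r)
  "Phase 1: mark every register reachable through car/cdr chains."
  (when (and (reg-p r) (not (reg-mark r)))
    (setf (reg-mark r) t)
    (mark (reg-car r))
    (mark (reg-cdr r))))

(defun sweep ()
  "Phases 2 and 3: unmarked registers return to the free-storage list,
marked registers are unmarked again."
  (dolist (r *heap*)
    (if (reg-mark r)
        (setf (reg-mark r) nil)
        (pushnew r *free*))))

(defun gc ()
  (dolist (r *base*) (mark r))
  (sweep))

;; Two registers stay reachable from *base*, one becomes garbage:
> (setf *base* (list (allocate 1 (allocate 2 nil))))
> (allocate 'garbage nil)
> (length *free*)
5
> (gc)
> (length *free*)
6]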
]/* Garbage collection */
sec[Other concepts][ McCarthy also describes a strategy for lazy evaluation of conditional expressions: only the p's and e's that are required are computed. He also introduces the stack as a new concept. Stacks make recursion possible. The problem with recursion without a stack is that each level of recursion needs the same registers that are still in use by the previous level in order to store its results. This conflict is resolved by pushing the results onto a stack and popping them off again, storing them temporarily. ]/* Other concepts */ ]/* Memory management */
sec[Conclusion][ Before we evaluate the quality of LISP as a programming language, we are going to outline the development of LISP up to today. After 1958, LISP became successful as an AI language. Countless new LISP dialects were developed by universities and companies. Because LISP had efficiency issues, special hardware, so-called Lisp machines, was built. During the 1970s, 1980s, and early 1990s, LISP had a very good period with a lot of funding from the US government, the US military, and the industry. LISP programming was a standard topic at some universities. In 1975, b[Scheme], one of the 3 major dialects that are still relevant today, was developed by Guy L. Steele and Gerald Jay Sussman at MIT. The main characteristics of Scheme are a very minimalistic syntax and core library, lexical scoping, and tail-call optimization. In 1984, b[Common Lisp], the second of the 3 dialects, was developed to standardize some earlier LISPs. Common Lisp is characterized by an inconsistent syntax, a huge number of libraries, and an extension for object-oriented programming. In the mid-1990s, the funding of and the interest in AI research dropped because of disappointment with the unsatisfying results in the AI field. During this "AI Winter" the popularity of LISP decreased dramatically. LISP had a small "comeback" in the 2000s, when a couple of new LISPs were developed. One of them was the third important dialect, b[Clojure], which was developed by Rich Hickey in 2007. Clojure is built upon the JVM and supports some newer features like multithreading and immutable data types. Most concepts from the McCarthy paper are still core concepts of recent LISPs. Over the years, LISP has also gained some new functionality and is today a multi-paradigm, general-purpose programming language. The DNA of almost every LISP can be described as follows: ul[ - LISP is b[functional]: it supports higher-order functions, lambda functions and closures - It is b[dynamic], because of its dynamic type system and because it usually runs in an interactive programming environment called a b[REPL] (Read-Evaluate-Print-Loop) - It has b[automatic memory management], because it uses garbage collection and automatic memory allocation - It supports metaprogramming with a very powerful b[macro system] - It is b[homoiconic], which means code and data have the same list syntax - b[Lists] are the major data structure - b[Recursion] is the most common way to express iteration ] q[It seems to me that there have been two really clean models of programming so far: the C model and the Lisp model.][Paul Graham] /* Hopefully not misquoted */ LISP is in many ways different from other programming languages. On the one hand, it has this extremely minimalistic syntax; on the other hand, it has nearly endless expressive power. This means that LISP code can be created or changed directly by other LISP code. This expressiveness is achieved through homoiconicity and macros. The most famous example is CLOS (the Common Lisp Object System), a language extension that makes object-oriented programming in Common Lisp possible. Another example is DSLs (Domain Specific Languages), which are easy to write in LISP. Basically, LISP is able to adopt almost any functionality and any paradigm without changing the compiler or interpreter. For example, Clojure has supported logic programming (see a[https://github.com/clojure/core.logic][core.logic on Github]) and pattern matching (see a[https://github.com/clojure/core.match][core.match on Github]) since 2011. q[Lisp isn't a language, it's a building material.][Alan Kay] Probably LISP was a few decades ahead of other programming languages. It influenced almost every programming language that followed, like Python, Ruby, or JavaScript. Some languages still came up with features that LISP already had decades earlier. For example, Objective-C introduced blocks, also known as closures, in 2009 (see a[http://arstechnica.com/apple/2009/08/mac-os-x-10-6/10/][arstechnica.com]).
A closure-like concept called lambda expressions is also planned for Java 8 (see a[http://docs.oracle.com/javase/tutorial/java/javaOO/lambdaexpressions.html][Oracle]). But LISP is not as successful as one could expect from a language with such superior functionality. If we compare the number of LISP-related questions with the number of questions about other programming languages on a[http://stackoverflow.com][stackoverflow.com], we find that LISP is not very popular. The percentage of questions about the three major LISP dialects is at roughly the same low level, under 0.3%. img[lisps.svg] cap[Fig.][Popularity of LISP dialects on a[http://stackoverflow.com][stackoverflow.com] (see a[http://hewgill.com/~greg/stackoverflow/stack_overflow/tags/#!clojure+lisp+common-lisp+scheme][source])][DIALECTS] How low this is becomes more obvious when we compare LISP with some mainstream languages. img[languages.svg] cap[Fig.][Popularity of LISP and other programming languages on a[http://stackoverflow.com][stackoverflow.com] (see a[http://hewgill.com/~greg/stackoverflow/stack_overflow/tags/#!java+javascript+python+objective-c+lisp][source])][LANGUAGES] And even if we take into account that functional programming languages are less frequently used than imperative programming languages, we must admit that LISP is rather unpopular. img[functional.svg] cap[Fig.][Popularity of LISP and other functional programming languages on a[http://stackoverflow.com][stackoverflow.com] (see a[http://hewgill.com/~greg/stackoverflow/stack_overflow/tags/#!scala+haskell+erlang+lisp][source])][FUNCTIONS] Now we are going to discuss briefly some of the possible reasons for this situation. h1[Fragmentation of the ecosystem] The already small LISP community is not only divided into 3 main dialects; the dialects themselves are also divided into many different implementations and flavors. Some examples are: b[Scheme derivatives:] a[http://racket-lang.org][Racket], a[http://www.call-cc.org][CHICKEN], a[http://www.paulgraham.com/arc.html][Arc], a[http://gambitscheme.org/wiki/index.php/Main_Page][Gambit] b[Common Lisp derivatives:] a[http://www.franz.com/products/allegro-common-lisp][Allegro Common Lisp], a[http://www.lispworks.com/products/lispworks.html][LispWorks], a[http://www.clisp.org][CLISP], a[http://ccl.clozure.com][Clozure CL] The fragmentation is a huge problem, because there is no joint effort to improve the dialects or LISP as a whole. h1[Lack of successful applications] It appears as if a[https://news.ycombinator.com][Hacker News] is the only relevant LISP application nowadays. Hacker News is a minimalistic social news website that ranks articles through a voting system. Because of its simplicity it cannot serve as a showcase for the capabilities of LISP in the way Dropbox is for Python or GitHub is for Ruby. Therefore companies and developers see no economic or technical advantages in using LISP for their projects. h1[Lack of modern tools] The most commonly used development tools are often old and do not offer the comfort that modern tools for other languages do. Of course, a[http://www.gnu.org/software/emacs/][Emacs] is very sophisticated and offers every feature one could ever imagine, but it also seems very unfamiliar to some people, and it takes a lot of effort to learn to use it effectively. A positive exception is a[http://www.lighttable.com][LightTable], a modern IDE for Clojure. h1[Lack of jobs] Worldwide, there are almost no LISP programming jobs available.
a[http://functionaljobs.com/jobs/search/?q=lisp][functionaljobs.com] currently lists 3 jobs and a[http://careers.stackoverflow.com/jobs?searchTerm=lisp][careers.stackoverflow.com] lists 4 jobs. So job opportunities cannot serve as a motivation to spend time on LISP development. /*h1[Other languages have caught up]*/ h1[Large entry barriers] We have seen in id[DIALECTS] that none of the major LISP dialects, not even Clojure, is able to attract many new developers. The LISP approach is so different from imperative and object-oriented languages that some people may have problems with the paradigm shift. Some people also complain that LISP code is hard to read with all the parentheses and recursion. It also appears to be a problem for newcomers to find guidance in choosing a dialect and tools. Like LISP itself, a lot of LISP-related content on the internet is very old and therefore looks outdated or abandoned. Even when they are not outdated or abandoned, many websites leave a very poor first impression, like a[http://www.lispworks.com/products/lispworks.html][LispWorks] and a[http://racket-lang.org][Racket]. However, LISP is still a beautiful programming language with an elegant syntax and endless expressiveness. This is even more impressive if one considers that it was developed 55 years ago. ]/* Conclusion */
sec[References][ bib[ROOTS][Paul Graham, Roots of Lisp, 2002, a[http://ep.yimg.com/ty/cdn/paulgraham/jmc.ps][Source]] bib[LISP][John McCarthy, Recursive Functions of Symbolic Expressions and Their Computation by Machine, Part I, Massachusetts Institute of Technology, Cambridge, Mass., 1960, a[http://www-formal.stanford.edu/jmc/recursive.html][Source]] bib[LAMBDA][Alonzo Church, The Calculi of Lambda-Conversion, Princeton University Press, Princeton, 1941] bib[IPL][Allen Newell and J. C. Shaw, Programming the Logic Theory Machine, Los Angeles, 1957, a[http://www.textfiles.com/bitsavers/pdf/rand/ipl/P-954_Programming_The_Logic_Theory_Machine_Jan57.pdf][Source]] ]/* References */
sec[Appendix][ code[lisp][; The Lisp defined in McCarthy's 1960 paper, translated into CL. ; Assumes only quote, atom, eq, cons, car, cdr, cond. ; Bug reports to lispcode@paulgraham.com. (defun null. (x) (eq x '())) (defun and. (x y) (cond (x (cond (y 't) ('t '()))) ('t '()))) (defun not. (x) (cond (x '()) ('t 't))) (defun append. (x y) (cond ((null. x) y) ('t (cons (car x) (append. (cdr x) y))))) (defun list. (x y) (cons x (cons y '()))) (defun pair. (x y) (cond ((and. (null. x) (null. y)) '()) ((and. (not. (atom x)) (not. (atom y))) (cons (list. (car x) (car y)) (pair. (cdr x) (cdr y)))))) (defun assoc. (x y) (cond ((eq (caar y) x) (cadar y)) ('t (assoc. x (cdr y))))) (defun eval. (e a) (cond ((atom e) (assoc. e a)) ((atom (car e)) (cond ((eq (car e) 'quote) (cadr e)) ((eq (car e) 'atom) (atom (eval. (cadr e) a))) ((eq (car e) 'eq) (eq (eval. (cadr e) a) (eval. (caddr e) a))) ((eq (car e) 'car) (car (eval. (cadr e) a))) ((eq (car e) 'cdr) (cdr (eval. (cadr e) a))) ((eq (car e) 'cons) (cons (eval. (cadr e) a) (eval. (caddr e) a))) ((eq (car e) 'cond) (evcon. (cdr e) a)) ('t (eval. (cons (assoc. (car e) a) (cdr e)) a)))) ((eq (caar e) 'label) (eval. (cons (caddar e) (cdr e)) (cons (list. (cadar e) (car e)) a))) ((eq (caar e) 'lambda) (eval. (caddar e) (append. (pair. (cadar e) (evlis. (cdr e) a)) a))))) (defun evcon. (c a) (cond ((eval. (caar c) a) (eval. (cadar c) a)) ('t (evcon. (cdr c) a)))) (defun evlis. (m a) (cond ((null. m) '()) ('t (cons (eval. (car m) a) (evlis.
(cdr m) a))))) ] cap[Listing][c[eval] by Paul Graham, a[http://ep.yimg.com/ty/cdn/paulgraham/jmc.lisp][Source]][EVAL] ]/* Appendix */