raganwald
(This is a snapshot of my old weblog. New posts and selected republished essays can be found at raganwald.com.)

Saturday, December 16, 2006
  Lisp is not the last word


Ken Tilton asked: What is up the power continuum from Lisp?

I don’t have a ready answer. However, just because I don’t have an answer doesn’t mean I don’t believe there’s an answer. It could be that Lisp is a little like Democracy. It could be the least powerful programming language possible, excepting all of the others invented so far. But you know what? I have faith we can do better.

Ken doesn’t say there isn’t a language up the power continuum. And I won’t say we have already invented one. Like Ken, I’ll pose a question: what law of computer science places a limit on the power continuum at Lisp?



G.J. Chaitin explains his proofs of Kurt Gödel’s incompleteness theorem and Alan Turing’s “halting problem” in computation. Chaitin’s creative use of Lisp in mathematics and fervent belief that no theorem is proof against new analysis are welcome shots of espresso.
Human history is chock-a-block full of inventions and practices that were considered for decades or even centuries to be the final word, the ultimate expression and implementation of ideas. And then someone came along and demolished everything. Geocentricity. Heliocentricity. Newtonian celestial mechanics. Light as a wave. Light as a particle. Three dimensions. Uniform space. Euclidean geometry.

Some of these new ideas took years to take root while the establishment derided them as “not even wrong.” Others were so obviously right they immediately displaced what had come before. We might not have invented a more powerful language yet. Or we might have invented one but not realized it. But who can say that we haven’t invented a more powerful language and never will?

If you believe there is a power continuum, if you are not so obsessed with Turing Completeness and theoretical equivalence, what is the argument that it has any limit whatsoever, let alone that its limit is Lisp?

I believe that the only language that is affixed to the top of the power continuum is Blub. For everyone whose imagination soars above the ceiling of their laboratory, Lisp is not the last word.


 

Comments on “Lisp is not the last word”:
This essay neglects to address the crux of the matter, which the author may not have realized was the other shoe waiting to drop when I asked that question: Lisp may not have powerful new feature X (such as the automatic dataflow management provided by my pure-CL Cells package), but once we have done the Turing thing and built a dataflow engine using standard lisp, we can hide all the wiring with macrology and special variables and symbol macros etc and make it look as if Lisp was a dataflow language. Something like continuations would be an example of something CL cannot really do, but then we get to argue over whether those are a Good Thing. :)

In other words, I was not saying new ideas are not to be had, I was saying CL will probably be able to absorb them and provide them as if they were always part of the language, because CL is a meta-language, or as Foderaro said, a programmable programming language.
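For instance (a toy sketch of my own here, nothing like the real Cells machinery, and AREA, *WIDTH*, and *HEIGHT* are just names I made up), even a lone symbol macro can make a computed value read like an ordinary variable, with the wiring tucked out of sight:

    (defvar *width* 10)
    (defvar *height* 5)

    ;; AREA reads like a plain variable, but every reference to it
    ;; expands into a recomputation from its inputs.
    (define-symbol-macro area (* *width* *height*))

    area               ; => 50
    (setf *width* 7)
    area               ; => 35, with no explicit update step in sight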

ken tilton
 
I've read Graham's article several times over the years, and your post prompted me to examine it more deeply.

We always think of the article as him claiming that Lisp is the most powerful language (which I'll concede he does). What I hadn't realized until this morning, though, is that he really means that Lisp has the most powerful abstraction. In fact, the first two times he mentions "continuum," he associates it with "abstraction" rather than "power":

"Languages fall along a continuum of abstractness"
"Blub falls right in the middle of the abstractness continuum."

Programs are collections of data and behavior, and code is how we express them. However, Lisp treats code as data itself - the highest abstraction a language can make.

I think, anyway. Is there a higher abstraction than a language treating its own code as data? I don't think so.
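A tiny illustration of what I mean (just a made-up form, nothing special about it): the same s-expression is both data to inspect and code to run.

    (defvar *form* '(+ 1 2 3))

    (first *form*)   ; => +         -- take it apart as data
    (rest *form*)    ; => (1 2 3)
    (eval *form*)    ; => 6         -- run it as code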

He claims that languages may asymptotically approach Lisp, but if they implement all its key features then they're just another dialect of Lisp. I think that's inaccurate. Lisp is perhaps simply the sole member of another family of languages, which I'll tritely coin meta-oriented languages :) There's nothing to stop another language from joining that family, and indeed some - such as Ruby - are quite close.
 
The very word "continuum" pleasingly reminds me of Cantor's infinities. Perhaps there is, as Pat suggests, a family of similar programmable programming languages with Lisp as the wise elder.

I agree that homoiconicity is the highest abstraction we know how to make. However, I am not at all sure that Lisp's expression of homoiconicity is the most abstract or most powerful expression possible.

In all Lisps, the syntax of code is data: the symbols and data structures of the program. However, programs are far, far more than their notation.

For example, languages with continuations (obviously including Lisps like Scheme) include an abstraction of programs that is not just the program itself.
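To sketch what I mean by reifying more than the notation (continuation-passing style in plain Common Lisp, not Scheme's first-class call/cc; the /k functions are just names I made up):

    ;; Each function takes "what to do next" as an explicit argument K,
    ;; so the rest of the computation becomes an ordinary value.
    (defun add/k (x y k)
      (funcall k (+ x y)))

    (defun double/k (x k)
      (funcall k (* 2 x)))

    ;; Compute (* 2 (+ 1 2)) by threading the continuation by hand:
    (add/k 1 2
           (lambda (sum)
             (double/k sum #'identity)))   ; => 6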

I expect that there are many other elements of programs that could be reified to produce more powerful members of the "programmable programming languages" family.

I agree with Ken that we can make Lisp look like a language with continuations or dataflow or anything else. But if we call such variations "Common Lisp" we dilute the meaning of the name. Such languages may be new members of the Lisp family, and they may be built on top of Common Lisp, but I think that arguing they are Common Lisp is a little like arguing Ruby is C.

In short I argue that the 'meta-language' character of Lisp doesn't make it the most powerful language on the continuum, just possibly the most convenient for building more powerful languages.

I've argued (in support of languages like Lisp and Ruby) that the idiom for building idioms is itself an idiom. Who is to say that we won't invent better idioms for programming programmable programming languages?
 
Three quick thoughts, Reg:

1. "[A Blub programmer] doesn't realize he's looking up. What he sees are merely weird languages..."

Lisp programmers cannot be Blub programmers. To rephrase Tilton, what on Earth would a Lisp programmer think of as weird? I mean genuinely weird, not merely perverse like Befunge or Brainf*ck.

2. I reject the argument that "foo is part of CL because once you discover foo you can write it in CL". Being part of CL means you don't have to write it yourself in CL. New ideas must overcome existing idiom. This problem is orthogonal to the extensibility of a language.

3. Lisp is at the top of a continuum that is losing relevance daily because other dimensions outside that continuum are becoming more and more vital to the fitness of a language.

Learning Lisp is like climbing Mount Everest. Like Everest, once you get up there you can say, "I am at the highest point on the whole Earth!" But then you look around and realize that the only people up there with you are crazed hermits, enigmatic sherpas, and a lot of rich white tourists. Let's consider just the economic dimension here: the hermits are up there because they hate capitalism and have gone a bit funny. The sherpas are up there earning minimum wage because they love the mountain. And the rich white tourists? They're up there because they're already rich enough to afford to go, or because they know they'll be able to leverage the experience in their consulting gigs to make more money later.

There is a new continuum, one in which mastering multiple specialities is the critical first step to being able to integrate those specialities. Make Lisp work in a secure, auditable environment. Give it a nice GUI builder. Make it so even a CTO's nephew can pick it up and learn it. Call it L#.NET and get it into the hands of half a billion programmers.

Leaving the existing continuum: That's "up" from Lisp.
 
Oh, no, I would not say Cells (the dataflow hack) /is/ Common Lisp. I would call it... a dataflow hack, and merely brag about its transparency and Lispy-ness and integration with CLOS. Indeed, the lesson I learned from Cells is precisely not to achieve productivity by creating new languages, rather by creating powerful extensions to existing languages.

ken tilton
 
Just because you can reprogram LISP (macros) in itself doesn't mean that LISP is the top dog.

Let me rephrase this. LISP, as in the language I can download and work with (say, Common Lisp, or Scheme, or some such), is certainly not top dog. In fact it's not much different from many other languages I could name.

LISP, as in the idea of a self-referential language that can be extended 'holistically' - in a way that the syntax itself gets an upgrade - is, obviously, by definition, the dog's bollocks. You can do, in theory, anything with it.

However, I contend that ALL LANGUAGES FUNDAMENTALLY WORK THIS WAY - because computers work this way.

That is, everything a computer can possibly do can be programmed. Assuming your language offers the raw basics (e.g. Turing completeness), you can construct anything you like. You can make a new extension to java which looks a lot like java but adds some minor tidbit and still compiles to .class.

The only reason that language is no longer java is because the exhaustive JLS specs say so. LISP 'the concept' (in the sense that there's no one language that is the one and only lisp) has no such spec to disallow it - but why would that matter?

Any sufficiently tricked out LISP is indistinguishable from a totally new programming language. This is also obvious. It's fun that you can call this 'LISP' and get away with this, but beyond a fun semantic little detail, who cares? If it quacks like a dove, walks like a bear, and looks like a postbox, it's not a duck, even if theoretically it still fits the definition of one, for all practical intents and purposes.

Let me try another rephrase: Using some fancy preprocessor statements I have seen pascal code compiled with plain gcc. Throw in a bunch of libraries and you can fake anything. Is this a good thing? Does this mean any language that chooses to abolish the preprocessor because it turns all code into slag that cannot be reasoned about and is a pain in the tush to use is somehow 'inferior'? Is C (with preprocessor) equal to LISP in that you can theoretically 'macro' every trick you can possibly think of? Is PASCAL really a dialect of C because of this?

In a certain very specific sense, yes, to all those questions. But I don't give a rat's ass about that 'sense'. It's entirely in the domain of the theoretic. In theory, practice is just like the theory, but in practice, practice is nothing like the theory. Or so the theory goes.

Final point: These vague discussions are imo pointless. The real question is some sort of combination of productivity, support, capability, execution speed, 'fun of programming', maintainability, etc, etc, along a large continuum of project sizes, types, and situations.

The idea that you can trick out your own language is but a teensy tiny little blip on the radar, and in many cases is actually a bad thing. (ANY java code can be sort of understood by anyone who purports to know about java, at least in a basic sense. This has caused some dumbing down of the language, but on the flipside, random LISP code may be totally and utterly non sequitur, not even offering the slightest hint. That's the price of a supposedly all-powerful language.) What's worth more? The macro facilities, or the provably linear-time chasability of code identifiers and the like?

I do not know the answer to that last question. All I know is: I have never read a satisfactory answer to the dilemma, merely a boatload of silent assumptions that the expressive power must be better... somehow.
 
Reinier, your experience programming Lisp macros and C macros is much different than mine. I actually did not recognize either language from your description. Ken Tilton
 
If everything can be added to LISP, then it is just as fair to say that LISP has none of the features as to say that LISP has all of the features.

In other words: I would expect a language to give me guidance, useful idioms, ways to express what I want to say, and not just grammar. The power of a language, for me, is not to allow me to say things, but to help me say things succinctly, so I'll be understood. Language is a social thing, not something you make up on your own.

Others may feel differently: Creating your own language and your own conceptual world certainly gives you a feeling of power. But does that make for a powerful language? What power: Power over bytes, the power to solve problems, power in the marketplace, or power over people?
 
Ken Tilton: You hit the nail on the head.

IN THEORY you can make gcc do anything provided you have a library and you have a sack of preprocessor statements. Yes, even LISP, and thus, by extension, anything LISP can theoretically do.

IN THEORY only though - because as you said, in practice, #define is nothing like a LISP macro.
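To be concrete about where the practical difference lies (a toy example, with a made-up SWAP-ARGS macro): a Lisp macro receives its argument as a structured form it can take apart and rearrange, where a textual #define only pastes characters.

    ;; The macro sees (- 10 3) as a list it can rebuild before compilation;
    ;; a #define would only see a blob of text.
    (defmacro swap-args (form)
      (destructuring-bind (op a b) form
        `(,op ,b ,a)))

    (swap-args (- 10 3))   ; expands to (- 3 10), which evaluates to -7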

IN THEORY. This is my entire beef with the view that lisp is more powerful. I'm sure you can conjure up a theory that says: You can pull off stunt 'X' in LISP and you "can't" in Blub, and so LISP wins! - But that's not really true. It might be inconvenient in Blub, but you can coax Blub to do anything - this is the nature of computers and a side-effect of what we know as 'Turing completeness'.

Which means the real discussion is unfortunately much, MUCH more vague. As I said, it's a combination of convenience, expressiveness, maintainability, support, 'fun factor', and a bunch of other hard to peg aspects.

Does LISP win at THOSE? I don't think so at all. I can see Haskell taking the cake there for a certain type of brain.

Of course you can fake haskell in LISP with enough macros.... you see where this is going.

The notion of *easy* 'macroability' is just one amongst a large pool of things a programming language can have. The fact that it allows you to do it all is NOT the end of the discussion. In fact, there are tons of situations where I'd rather have a language with some restrictions to it. (note the *easy* - any language can be used or modified to support a feature that has all the benefits of macroability. Whether or not this is easy, common, or part of the language's "geist" is another issue entirely).

Restrictions breed different types of power.

And that discussion, as I'm sure you've noticed by now, given how much of the internet has served as a forum for it over the past couple of decades, is a rather difficult one to settle. It's the old adage about Assholes and Opinions: everybody has at least one of each.

I'm not quite sure what it takes to make headway in the discussion, but throwing theoretics into it won't get you anywhere, I know that much for certain.
 
It seems to me that few, if any, participating in the language debate care to enumerate the idioms (aka aspects) that make one language "better" than another.

I really want to see a statement of a problem, and an example of how having that aspect is better suited to solving that problem. I do not expect full working code, just snippets and structure to show the major semantic differences. The Computer Language Shootout (http://shootout.alioth.debian.org/) is useless because some language aspects do not reveal themselves on such simple problems. Furthermore, CLS has a high numerical processing bias, which probably does not match your problem domain. If anyone knows of a site that does enumerate language idioms/aspects, please let me know.

CL, with its macros, is one of the few languages that has demoted language semantics, idioms, or aspects to that of a code library. With this in mind, all languages are really compared based on your needs, and what libraries are available to solve those needs. It is easy to see why some consider Java a win over CL: Java has libraries that solve the bulk of the problems at hand. Dealing with new databases, file formats, protocols, thread pooling problems, transactional processing problems, even most numerical processing problems, all are solved by a Java library out there. My Java programs are mostly just a scaffolding to hold others' work together.

But I do miss macros. Instead, I have to live with the Java Reflection API and my own heavyweight compiler/decompiler to achieve my ends. It is all behind a simple API, so although it is no CL, it is nowhere near the alternative: manual code expansion.

Here are some improvements I would like to see in CL:

Type Checked Macro Expansion: CL macros are simply s-expression concatenation with no type checking to prevent bone-head errors. I just saw ?Emacs? has a macro expander debugger, but it is a runtime tool. There are still many type checks that can be done at compile-time to verify macro sanity, although it requires more type declarations (or type analysis) to do these checks (a toy example follows below).

Better Compiler Optimizer: Macros are good, but the high level code that leverages them is excruciatingly slow. More work needs to be put into optimizers so that high level languages, built from macros, are fast enough for production use.

Separation of Concerns: Macros are good, but they are as limited as libraries in that they declare only one of many possible implementations. It would be good to separate the semantic effects of a macro from its implementation so that experts can improve implementation without dealing with semantics. Truly, I do not even know if this is possible, short of writing a compiler.
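To ground the Type Checked Macro Expansion wish above, here is a toy example (a made-up SQUARE macro): the expansion is produced by plain list splicing, and nothing checks it before it eventually runs.

    ;; Expansion succeeds without complaint; any type error only
    ;; surfaces when the expanded code is finally executed.
    (defmacro square (x)
      `(* ,x ,x))

    (macroexpand-1 '(square "oops"))
    ;; => (* "oops" "oops")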
 
You can't do continuations in Common Lisp without rewriting the reader, but you can do them in Haskell with a monad.

I'm still not sure what they're good for, though. And nice as Haskell is, there are things you can do efficiently in imperative languages that generally take a penalty in functional ones. So...

I'll argue that no language is best, but assembly is good for writing the first compiler in ;)
 
The problem is that we were going backward for a long time.

The next thing we will reinvent is reflection. Yep, Lisp always had it. But who uses it today?

Only Smalltalkers, I think. And LAMP'ers.
 
Two suggestions:

Alan Kay's new Albert project.

Frames/Contexts combined with a declarative Prolog-like language.
 
I object to the assumption that Lisp is the most powerful widely-known language.

Besides the fact that the vast majority of Lisp's features are trivial to implement in other programming languages, Lisp itself is highly restricted in its capabilities.

Lisp doesn't give you the power to access the underlying operating system like C. Nor does it give you the ability to leverage advanced hardware capabilities like moving calculations to graphics cards. From what I have read, most Lisp implementations don't even handle multiple threads well.

Lisp has some neat tricks, but they are all confined into a little box. As soon as you want to reach out and do something besides manipulate in-process data, its true weakness shows.
 
; LISP has survived for 21 years because it is an approximate local optimum in the space of programming languages.
; -- John McCarthy (1980)

For me, this quote best describes what makes Lisp close to the last word.
 
More domain-specific languages and systems can beat out Lisp.

If you are doing data analysis: SAS

If you are doing mathematical analysis and visualizations: Mathematica

But as a general purpose programming language, Lisp is pretty near the top of the heap.
 
Lisp doesn't give you the power to access the underlying operating system like C.

Nothing stops any language from producing machine code at runtime.

For those who do not understand what I am talking about, there is an FFI.

But wait... OS in common lisp??? :)
 
Nothing stops any language from producing machine code at runtime.

Then nothing stops you from doing anything in another language that you can do in LISP.
 
Then nothing stops you from doing anything in another language that you can do in LISP.

Sure, everything could be done in a Turing-complete language.

The feature of Lisp is that it allows you to build the language you need easily.
 
> You can't do continuations in Common Lisp without rewriting the reader, but you can do them in Haskell with a monad.

Continuations don't require syntax in lisp, so it's incorrect to say that one can't do continuations in Common Lisp without rewriting the reader.

Efficient continuations do require some implementation support that isn't required by the Common Lisp standard, but that's a separate issue, as there are other lisps. Many of the ones that have good support for continuations are known as "schemes".

Note that one can read scheme with a Common Lisp reader. One can even write "Common Lisp" that uses the relevant operations; it's just that those operations won't necessarily work as well as one might like.
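For instance (a small demonstration), the stock reader turns Scheme source into ordinary lists and symbols with no changes at all; only the semantics of the operators differ, not the syntax.

    (read-from-string
     "(call-with-current-continuation (lambda (k) (k 42)))")
    ;; first value => (CALL-WITH-CURRENT-CONTINUATION (LAMBDA (K) (K 42)))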

It's wrong to say that continuations require rewriting the reader.
 
"Learning Lisp is like climbing Mount Everest...."

Great analogy!
Cited in analogies and metaphors.
 



