raganwald
Tuesday, January 03, 2006
  Finding the Signal-to-Noise Ratio in the Never-Ending Language Debate
Once again the web is buzzing with one of the most boring debates ever, namely "what's the best programming language?" I will not rehash the arguments. I'll try to restrain myself to stating my position, giving my reasons, and then toss in speculation on why a certain web framework is growing like wildfire.

My position is simple: the language operating at the highest level of abstraction while not suffering from project-killing defects is the one I want to use. My algorithm is simple: throw away languages that will scuttle my project, and pick the highest level language out of those that remain. (I'm not going to discuss defects. They are a matter of taste and lead us down an unhappy path.)

I believe that programming is an idiomatic activity. We learn idioms and then apply a kind of pattern-matching to recognise problems that can be solved with an idiom we already know. Some idioms are easier to express than others in each programming language. (Does the Weak Sapir-Whorf Hypothesis apply to programming?)

Sure, all languages are Turing Complete and therefore equivalent in theory, but in practice you might find that the only way to express an idiom from Language A in Language B is to write an A->B compiler or interpreter in B.
Greenspun's Tenth Rule of Programming: "Any sufficiently complicated C or Fortran program contains an ad-hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp."
Does that sound too academic? Consider this: how would you write a continuation-based framework (like Seaside) in Java?
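For a toy sense of what that idiom even is — this is not Seaside, just the bare mechanism sketched in Ruby — a continuation captures "the rest of the computation" so it can be resumed later:

```ruby
require 'continuation' # callcc lives in this stdlib file in Ruby 1.9+

# Capture the current continuation; each call to it jumps
# back to just after callcc and re-runs everything that follows.
count  = 0
resume = nil
callcc { |k| resume = k }
count += 1
resume.call if count < 3
# count is now 3: the code after callcc ran three times
```

A continuation-based web framework captures the continuation at the point a page is rendered and resumes it when the next request arrives — a mechanism that is painful to simulate in a language without first-class continuations.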

This, incidentally, is why the language debate will never end. Programmers naturally learn and know the idioms afforded by their language. So when you say "In my language I can very easily do X and Y," the programmers using other languages always say "But nobody needs X or Y. We're busy doing Z."

So let's talk about idioms instead of so-called features like syntax. What does it mean for one language to be "higher level" than another? To have a "higher level of abstraction"? My own interpretation is simple:

One language is higher-level than another if it expresses more idioms in fewer bits than the other language. Or fewer lines of code. Or fewer symbols. There's a really simple language idiom for this: Programs in a higher-level language have a high signal-to-noise ratio.

For example, there are these "design patterns" that all of the COBOL^H^H^H^H^H Java programmers like to use. One of these is called Visitor: "One key advantage of using the Visitor Pattern is that adding new operations to perform upon your data structure is very easy. All you have to do is create a new Visitor and define the operation there. This is the result of the very distinct separation of variant and invariant behavior in the Visitor pattern. The invariant behaviors are represented by the data structure elements and the abstract Visitor. The variant behaviors are encapsulated in the concrete Visitors."

Got that? How many lines of code do you need in Java? Here's how you express the Java idiom of the Visitor pattern in Ruby:

concrete.each { |thingy| visitor.visit thingy }
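For contrast, here is a hedged sketch — invented class names, not anyone's production code — of the scaffolding the pattern prescribes, transliterated into Ruby, next to the one-liner doing the same job:

```ruby
# The Java-flavoured scaffolding (names are hypothetical):
class Visitor
  def visit(element)
    raise NotImplementedError, "subclasses supply the operation"
  end
end

class DoublingVisitor < Visitor
  def visit(element)
    element * 2
  end
end

concrete = [1, 2, 3]
visitor  = DoublingVisitor.new

# The one-liner, using map here so we can see the results:
doubled = concrete.map { |thingy| visitor.visit thingy }
# doubled == [2, 4, 6]
```

All the separation of variant and invariant behaviour is still there; the block simply does the iteration and dispatch that the pattern's boilerplate exists to arrange.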

Do you think I'm making this thing about brevity up?

Here's something I found this morning. This fellow is explaining why interviewees can use any language they like to solve a problem, provided it's black:
When I interview someone, I usually let them use the language of their choice between C, C++, C# and Java... It's not that I have something against Ruby or other fourth generation languages, but I found that these languages work at a level of abstraction that is a little too high to give me a meaningful insight on the candidate for a forty-five minute interview. There are plenty of very interesting problems that are hard to solve even in Ruby or Python (and actually, it's quite likely this is the kind of problem that the candidate will be struggling with if she gets hired), but formulating the question and writing the solution would take several hours.
The Language of Interviews
Let me see if I can condense this: there is a class of problems that are non-trivial to solve in C# and Java but are trivial in Ruby and Python. Hot damn, I could have skipped writing this whole thing and simply posted the quote.

(By the way, do you buy that something non-trivial to solve in Ruby must necessarily require hours to explain and to write out? Neither do I. I am not bashing the post here. That point is really a very small aside. But I don't have any trouble thinking of Ruby, Python, or Lisp problems that would require only a few minutes to explain and would be appropriate for an interview.)

Now, there's some idea that you can build your own idioms in any programming language. We have various names for the technique (like procedures, subroutines, objects, modules, and macros), but the argument is that you can build up a library or framework of idioms and then you can ultimately express yourself with startling brevity.

So some state that all languages are equal because if something is difficult to express you build your new idioms into the language and presto, you have a high signal-to-noise ratio in your programs.

This argument has a major, glaring flaw: the mechanism for building idioms in each language is an idiom itself, and it tends to be the most rigidly limiting idiom on the language. C++ is a language for defining abstract data types that look just like built-in types. In Smalltalk everything is an object that belongs to a class. And so on. Let's call this the language's "meta-idiom."

The argument about idioms applies to meta-idioms. So if you want to build new idioms in a language, you need to use its meta-idioms. Some languages provide higher signal-to-noise on meta-idioms than others. Some languages make it easier to construct your own idioms than others. So even though all languages may provide some mechanism for raising the signal-to-noise ratio, not all are equal when it comes to using their mechanisms.
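Ruby's meta-idiom, for instance, is open classes and runtime definition. A minimal sketch of minting a brand-new idiom (the method name is invented for illustration):

```ruby
# Reopen a core class and add a tiny idiom of our own:
class Array
  def second
    self[1]
  end
end

# The new idiom reads as if it had always been in the language:
pair = [:signal, :noise]
pair.second # => :noise
```

The point is the signal-to-noise ratio of the meta-idiom itself: four lines buy a construct that is indistinguishable from the built-in vocabulary.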

So, I've written that I believe that not all languages are equal. I've written that I believe that an important metric for comparing languages is the signal-to-noise ratio, as measured by the brevity of expressing idioms. Now I will conclude by revisiting the affordances argument.

Some languages make it easier to express some idioms than others. How? See the paragraph above about signal-to-noise ratio. Given that some things are easier than others in each language, wouldn't you expect that there will be a very few languages that can express an order of magnitude more idioms in an order of magnitude less code than the 'average' language?

(If you don't think so, why not? Doesn't practically everything you've ever seen operate on a power law distribution? Why should programming languages be different?)

So, I expect that some languages will do a better job of expressing a larger set of idioms than others. Now, here's the key question: does the number of idioms that a language affords matter?

I diverge from the wise men who write essays on this topic. I think it matters to me, but it is unlikely to sway anyone's decision. My simple observation is that programmers develop and use idioms that are easy to express in their language. They 'cut with the grain.'

When considering one language over another, my first consideration, naturally, is with the idioms I already know. Does the new language help? This consideration is exactly like almost any other "upgrade" consideration. When considering an upgrade, the first thing you want to do is solve an existing problem.
When I purchased an iMac for my home, I had no need for playing DVDs. So I was very close to buying the 17" model. Luckily, I purchased the 20" model so I would have more desktop for programming. I already had a 17" monitor on my development box, and it was starting to feel cramped. Now that I have the 20" iMac, I love watching DVDs on it and I'm very happy I didn't settle for the model that seemed to suit what I thought I would need.
Once you're satisfied that the upgrade meets your needs and solves a problem, you go ahead. Later on, you'll discover a whole new world of features. These grow on you, until one day you ask yourself how you ever lived without them.

And so it is with programming languages. I can write about meta-programming in Ruby and Scheme for six more posts, but if you aren't meta-programming now, my guess is that you won't switch. You simply don't think in terms of that idiom, so you perceive it to be a "nice to have," but not essential.

What will get you to move up the ladder is when someone shows you how to solve an existing problem in fewer bits. This, you can recognise. Did I say signal-to-noise? Imagine your favourite music on AM radio in your grandparents' car. Now imagine you hear it on a beautiful stereo. More signal! Less noise!

But you aren't going to switch because someone says "if you think listening to music is fantastic, just wait until you hook up a keyboard and use GarageBand to compose music..." That simply doesn't mean anything to you until you've already bought your new iMac.

So although I fervently believe that a better language will allow you to express more idioms than you are currently using, I know that you are unlikely to switch for that reason. So I'm not going to write about how cool it is that someone could add "acts_as_versioned" to the Rails framework. You probably only care about whether "acts_as_versioned" saves you a ton of boilerplate versioning data in your application.
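For the curious, here is a hedged sketch of how such "acts_as_*" class macros are typically built — an invented ActsAsGreeter, nothing like the real acts_as_versioned internals — showing that the meta-idiom is just a class method that defines methods:

```ruby
# Hypothetical module; the real plugin is far more involved.
module ActsAsGreeter
  def acts_as_greeter(name)
    define_method(:greet) { "hello, #{name}" }
  end
end

class Widget
  extend ActsAsGreeter
  acts_as_greeter "world" # one declarative line buys a behaviour
end

Widget.new.greet # => "hello, world"
```

One declarative line in the class body, and the boilerplate lives in exactly one place.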

Speaking of Rails, I'm going to conclude with my take on one reason why Rails is taking off and Seaside is not. Rails allows programmers to express the idioms they already know (relational databases, web-backed MVC, stateless event handling plus a global session store) in fewer bits.

Seaside provides a whole new idiom, continuations, that IMO is more powerful. I think you end up with an even higher signal-to-noise ratio with a Seaside app than with a Rails app. Why? Because continuations afford you a much higher degree of controller reuse.

Now, here's the catch: if you try to imagine your current application running on both Rails and on Seaside, you probably won't see much difference between the two (although they'll both be an order of magnitude better than ASP.NET). They will look the same because you designed your application with idioms that both Rails and Seaside support.

To get a big win, you'd have to rethink your application's flow and logic. You'd have to "think in Seaside." And you're not going to do that. So you pick Rails, like so many others have picked it, because it looks just like your ASP app, only all the noise has gone away. It's all signal, baby.

Now, do you see the thing here? Ruby has been around for a while, but nobody switched. Why? Because they couldn't see how its new idioms would help them. Every time a programmer considered switching from Blub to Ruby, she was presented with all these new idioms, like dynamic types. None of this meant anything to her, because they didn't appear to make the idioms she already knew more compact. No win.

But now she looks and she sees her existing web idioms, and she sees them expressed with way fewer bits, and she is suddenly prepared to learn this Ruby thing if it will let her say:

class Language < ActiveRecord::Base
  has_and_belongs_to_many :idioms
end

Happy New Year!


Comments on “Finding the Signal-to-Noise Ratio in the Never-Ending Language Debate”:
self-promotion on reddit pays :-P

I really enjoyed your article, but you could have made three from it:

* What's trivial where?
* It's all about meta-idioms
* You won't switch for a new idiom

Nevertheless you now have a place in my feedreader. Happy New Year!
 
Thanks for the kind words. I would have liked to include all three ideas but do so in 1/3 the space.

Obviously, *I* lack the necessary signal-to-noise ratio to be a "higher level" blogger.

p.s. I realize self-promotion is frowned upon in some circles yet welcomed in others. I included a comment on reddit to make sure that I was not misrepresenting my role.
 
Doesn't the definition of signal and noise depend on who you are talking to?

I don't understand one line of your Ruby code. It's all noise to me. I have to learn to read the signal. At some point, cleaning up the noise from another signal is easier than learning to make sense of a new noise that says the same thing.
 
GP, you're mixing two different arguments.

I think you're actually making a valid point about ROI: you need to make an investment to learn a new anything before it pays back. So a marginal improvement in signal-to-noise might not justify a large investment in rewiring your brain.

But just because you choose not to decode the signal doesn't make it noise: the signal is the minimum set of bits that carry information, and the rest is noise. This is an objective reality, as I can prove by executing the code in irb.
 
I am disappointed. You said not a word of business objectives and their relationship to a) getting paid and b) producing artefacts by programming.
 
"You said not a word of..."

Remarkably true. And these are words that ought to be said. Thanks for the suggestion for future posts.
 
"Frandlesizzle" - which in Giampieroian means:

On perfectly technical merits, you are right that I am mixing my arguments.
 
Business objectives: I went to a presentation recently about code generation. The guy said, if we're both consultants, and I'm using code generation and you're not, I'm going to kick your tail, because I'll be more productive. If we're both employees, I'm going to provide better value to my employer, same reason.

Using language abstractions that get rid of repetitive code has the same effect.
 
Here's an interesting item: the exercises from the first six chapters of "Thinking in Java" done in both Lisp and Java. You can easily compare the signal-to-noise ratio for both languages.

My suggestion is that if you think one of the two is incomprehensible but clearly carries less noise, learn that language.

http://bc.tech.coop/blog/040430.html
 
I read up on Ruby. It's a nice language, no doubt. I can finally read that code that you wrote.

concrete.each { |thingy| visitor.visit(thingy) } # I'm a sucker for parentheses

which would be the following equivalent in C#

foreach (T thingy in concrete)
{
visitor.visit(thingy);
}

That wrapped-up single line of code is pretty sexy compared to the whole foreach construct.

I concede my original point. It was off base. The point was supposed to be more like "the 'best' language isn't always the 'better' language" or something like that.

But thanks for making me look this up. Very interesting. I also read about erlang because of you. That was less pleasant than Ruby.
 
...the mechanism for building idioms in each language is an idiom itself, and it tends to be the most rigidly limiting idiom on the language.

I was bracing myself for the next sentence, which I fully expected to be something like this: "Lisp is the only language where the mechanism for building idioms is itself the dominant idiom."

Have I been reading too much Paul Graham, or is that something you thought of as well, but didn't include because you knew this sort of thing causes most audiences to say "oh, it's gonna be that kind of article" and leave?
 
GP:

What you say is true, but only expresses part of the whole truth. The Java expression of the visitor pattern also requires a whole lot of other scaffolding.

I may expand on this in a more technical post... one day...

Some idiot:

Unfortunate choice of handle, your comment does not strike me as idiotic :-)

I have certainly heard the argument that building idioms is the dominant idiom in Lisp. I do not buy this argument in theory, but I do buy it in practice.

In theory, high level languages ought to be used to build enough abstractions and idioms that you can express the core logic of an application in very few lines.

In practice, most people just use "off the shelf" and "design pattern" idioms in most Blub languages and don't bother creating new ones. (The big exception is the field of framework creation).

While Lisp and Scheme afford a wider variety of meta-idioms, I wonder if there is a cultural bias at play?

If enough people write essays saying "Lisp is about meta-idioms" and enough corporations mandate that "Blub is about writing code that can be outsourced to people who can't learn new idioms," will self-reinforcing cultures grow around the languages?
 
Mea Culpa:

My comment about another post strayed from discussing the words into speculation. I have struck out the offending phrase.
 
I think self-reinforcing cultures are already in place, at least as far as Lisp/non-Lisp dichotomy is concerned. Lispers look down on everyone else as intellectually inferior or at best uninformed, and non-Lispers see Lispers as pompous ineffectual ivory-tower jerks. Neither of the camps is very interested in what the other one is up to; both evolve separately.

I kind of like it this way, to be honest. The alternative seems to imply the need for compromises, and that is the best way to kill new ideas. Only natural selection can guide evolution of programming, and certainly not contrived idealistic cooperation.
 
FWIW, I use Seaside and the reason I prefer Seaside is fairly well summed up here.

For me, it's all about the environment.
 
Hi Reginald, I liked your article... the signal-to-noise ratio was a good abstraction for the discussion.

But I think here:

I think you're actually making a valid point about ROI: you need to make an investment to learn a new anything before it pays back. So a marginal improvement in signal-to-noise might not justify a large investment in rewiring your brain.

you have contradicted yourself when you said that about Seaside:

To get a big win, you'd have to rethink your application's flow and logic. You'd have to "think in Seaside." And you're not going to do that. So you pick Rails, like so many others have picked it, because it looks just like your ASP app, only all the noise has gone away. It's all signal, baby.

Put simply, I think that Seaside provides another level of abstraction, dealing with the limitations of HTTP's stateless protocol with continuations and fostering reuse. There are many other gains from using it (in the OO paradigm), but there is no space here for them.

And that's why I think you contradicted yourself, as you said: "But just because you choose not to decode the signal doesn't make it noise."

Actually I work with Python and Java, and have only played with RoR and Seaside, so please correct me if I'm wrong.

So, all this leaves me thinking that the signal depends on the receiver (people) too, not only on the emitter (the 'bits'), correct?

Cheers!
 
Paulo:

I don't think there's a contradiction when I say you need a good ROI and when I say most people won't rewrite their applications to take advantage of Seaside.

My point about ROI is that you should consider it before switching. It may not be worth switching between, say, Groovy and Nice. It might be worth switching from Visual Basic to Seaside.

Now about not switching to Seaside. I may not have made myself clear: I didn't say there wasn't a return on switching to Seaside, only that most people won't perceive the return because they haven't embraced the idioms it affords, namely Smalltalk's elegance, the Smalltalk development environment, and a continuation-based framework.

So they see a big investment, but can't see a big return. Rails looks like a bigger return to the bulk of the market because it's easier to see the reduction in noise for existing idioms.

Now as to your last point. In the discussion with GP, my point was that just because you can't understand something, that doesn't mean it's noise.

However, it's possible that you understand something but don't value it. I suggest that for most programmers, they can learn what Seaside is expressing, but they don't appreciate how valuable it would be if they tried it.

The problem is that it has good signal-to-noise, but it's signalling an idiom that is not in the "popular" approaches.

I was trying hard to distinguish between the signal-to-noise issue and the idiom vocabulary issue. I see I could have done a better job on that.

Thanks for your comment!
 
An interesting article on applying continuations to web development:

http://www-128.ibm.com/developerworks/library/j-contin.html

My understanding is that this is one of the (but not the only!) key ideas behind Seaside.
 
After years of developing software, I can tell you that the choice of programming language is secondary to the choice of IDE. A good or great IDE allows me to spend more time on the most important thing - namely solving business issues. A bad or average IDE forces me to spend more time on unimportant things - debugging and writing code.

Say what you will about the Basic language, but I still maintain that, to this day, Visual Basic (Studio) 6.0 is the best programming IDE ever created. The ability to step into your code at any moment and inspect any value or object yields huge productivity gains. The VB 6 IDE offers "Intellisense" on the Basic language AND any objects you define and create - another big productivity gain.

I've used many IDEs in my career and I've never run across one that is in the same ballpark as Visual Studio 6 in its overall simplicity, productivity and power.

-msanders (another idiot)
 
"After years of developing software, I can tell you that the choice of programming language is secondary to the choice of IDE."

Would you be interested to know that you have much in common with the Smalltalk community?

The Smalltalk development environment, the language, and the choice to build Smalltalk in Smalltalk are all facets of this diamond.

http://www.cincomsmalltalk.com/userblogs/avi/blogView?showComments=true&entry=3284695382

If you do a little googling, you will come across lots of information about Smalltalk's refactoring browser, live debugger, and other mature tools.

Not that anyone would ever want to switch from VB (see the main body of the post). I'm just pointing out that there is a lot of support for the thesis that the quality of tools is more important than the potential of the language.
 
The probability of error grows with lines of code (LOC), and programmer productivity shrinks with it. Finding a language of brevity is then directly related to cost and schedule, as well as to the potential for success. Ruby currently floats to the top in this category. Why would a project not consider this? The biggest argument against it is the switching cost, which exists with any legacy system.

There are a couple of other points that make it more powerful, one being the fact that Ruby supports full reflection. Other languages to date cover only portions of full reflection.

Duck typing is the most interesting aspect of the language. Formal typing like Haskell's is everything if I'm doing work for the NSA, but if I'm slinging web pages for some corporation, it isn't needed--testing and QA are what's needed. So why half-ass the type system like C/C++/Java did?
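The duck-typing point can be sketched in a few lines (made-up classes): no declared interface anywhere, just a shared message:

```ruby
# Neither class declares an interface; both simply answer #quack.
class Duck
  def quack; "quack"; end
end

class RobotDuck
  def quack; "beep"; end
end

# Anything that responds to #quack works in this pipeline:
sounds = [Duck.new, RobotDuck.new].map { |d| d.quack }
# sounds == ["quack", "beep"]
```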

In the eighties the promise of 4th generational languages (4GLs) was all the buzz. It fizzled like most of the buzz of the last 4 decades of computing. Ruby actually delivers all the promises made about 4GLs.

On top of all that, it just works.
 
choosing a programming language because of signal-to-noise ratio is like picking woodworking tools based on the prettiness of rubberized grip handles.

--qwe1234 on programming.reddit.com
 
SNR is the ratio of actual code to boilerplate code; a more powerful language needs less boilerplate.
 




Reg Braithwaite

