raganwald
Wednesday, October 26, 2005
I'm not young enough to know everything
Paul Graham has alluded to the difficulty of doing a startup when you're, well, not young. He puts the upper bound at thirty-eight:
So who should start a startup? Someone who is a good hacker, between about 23 and 38... I don't think many people have the physical stamina much past that age. I used to work till 2:00 or 3:00 AM every night, seven days a week. I don't know if I could do that now. Also, startups are a big risk financially. If you try something that blows up and leaves you broke at 26, big deal; a lot of 26 year olds are broke. By 38 you can't take so many risks-- especially if you have kids.
Paul Graham, How to Start a Startup
While what he says is true, lately I've become aware of an even bigger threat to starting a startup. Experience can be a handicap.

Let's start with a digression. What makes weblog posts popular? Most people say things like "insightful" posts become popular. Or "well-written" posts become popular. Adjectives like that are tautological: if someone likes a piece of writing, they generally will find nice adjectives to apply to it.

One model for popular writing is that it panders to the reader's prejudices. Plain and simple. People like writing that validates them and especially their ideas. I'm no different, and as a result I tend to focus my research on things that I already think I know.

When you're twenty-two this isn't much of a problem because you know you don't know. You're "consciously incompetent." So you're far more likely to find something unfamiliar and try to understand it, to change your way of thinking to match what you learn rather than applying a "bozo filter" to it in advance.

But at forty-two (or three!), it's easy to think you know things. You're at incredible risk of thinking you know things when you've achieved some measure of success, no matter how modest. You become "unconsciously incompetent." You don't know, but you don't know you don't know.

I was personally reminded of this at startup school. As Chris Sacca pointed out:
The glow of screens (from a refreshingly Powerbook-dominated audience) revealed an array of real-time collaborative note-taking tools, virtually assembling the room's minds in a concurrent recording and discussion of the event.
Chris Sacca, Startup School - An Inspiring Room Full of Hackers

Sitting in that room, I was hyper-aware that I was no longer in Kansas. It struck me that if I didn't want the next generation of hackers to wipe me out like a tsunami, I needed to stop paddling, climb on my surfboard, and accept the risk of being dashed on the rocks.

I immediately made a commitment to myself to let go of things I used to think I knew.

Matz, Jonathan Ive, and A Narrow Road to a Far Province

One commitment was to try Ruby on Rails instead of Lisp. Every time I've looked at Ruby, I've thought, "Nice, but it doesn't do anything I couldn't do in Smalltalk back in 1980."

I've been the most pompous hypocrite. I mean, I often poke my Windows-apologist friends and tell them that efficiency and user interface aren't measured with check boxes ("Mouse? Check. Icons? Check. Well, they must do the same thing."). I've extolled the virtues of design, of the interaction between the parts, of the frictionless user experience of a Macintosh.
Do not seek to follow in the footsteps of the wise. Seek what they sought.
Matsuo Basho
OK, Ruby's blocks and classes and closures are the same things that Lisp has given us since 1959 and Smalltalk has given us since 1980. But maybe... maybe... maybe Ruby on Rails is easier to use than Lisp in the very same sense that OS X is easier to use than Windows.
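The blocks-and-closures claim is easy to make concrete. Here's a minimal Ruby sketch (my own toy example, not from any particular library): a lambda closes over a local variable, exactly the trick Lisp has offered since 1959.

```ruby
# A lambda closes over `count`, so each counter carries its own
# private, mutable state -- the essence of a closure.
def make_counter
  count = 0
  lambda { count += 1 }
end

tick = make_counter
tick.call  # => 1
tick.call  # => 2

tock = make_counter
tock.call  # => 1 (independent state; each closure has its own `count`)
```

Nothing here would surprise a Lisp or Smalltalk programmer, which is precisely the point: the question isn't whether Ruby has the feature, it's whether Ruby makes it pleasant.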

As for the Rails thing, it's awfully popular, and I've always been uneasy about anything so hype-ridden. But how is that different from any of a thousand funny quotes deriding the new new thing? I'm tempted to say that there's a world market for maybe five Rails applications. But maybe there's more to it.

Maybe I should find out.

Thomas Bayes, Joel Spolsky, and Richard Feynman

Joel Spolsky dropped an incredibly provocative anecdote into one of his well-written and insightful posts:
A very senior Microsoft developer who moved to Google told me that Google works and thinks at a higher level of abstraction than Microsoft. "Google uses Bayesian filtering the way Microsoft uses the if statement," he said. That's true. Google also uses full-text-search-of-the-entire-Internet the way Microsoft uses little tables that list what error IDs correspond to which help text. Look at how Google does spell checking: it's not based on dictionaries; it's based on word usage statistics of the entire Internet, which is why Google knows how to correct my name, misspelled, and Microsoft Word doesn't.

If Microsoft doesn't shed this habit of "thinking in if statements" they're only going to fall further behind.
Joel Spolsky
My entire career is built on experience with applications that manage ontologies: applications built out of objects and classes and tables and relations and all sorts of things that boil down to if statements. Or at least, they boil down to classifying things in advance rather than building systems that learn over time.

I've known about Bayesian classification for years, and I've always thought of it as a specialized tool. It's incredibly disruptive to think of it as an everyday tool, as a general-purpose tool, as something that can replace the if statement.
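To see what "Bayesian classification as an everyday tool" might look like, here's a toy naive Bayes classifier in Ruby. The categories and training sentences are invented for illustration; a real classifier would need far more data, and far more care, than this sketch.

```ruby
# A toy naive Bayes classifier: word-usage counts stand in for the
# "statistics of the entire Internet". Training data is invented.
class BayesClassifier
  def initialize
    @counts = Hash.new { |h, k| h[k] = Hash.new(0) }  # category -> word -> count
    @totals = Hash.new(0)                             # category -> total words
  end

  def train(category, text)
    text.downcase.split.each do |word|
      @counts[category][word] += 1
      @totals[category] += 1
    end
  end

  def classify(text)
    words = text.downcase.split
    @totals.keys.max_by do |category|
      words.reduce(0.0) do |score, word|
        # Laplace smoothing so an unseen word doesn't zero out a category
        score + Math.log((@counts[category][word] + 1.0) / (@totals[category] + 1.0))
      end
    end
  end
end

bayes = BayesClassifier.new
bayes.train(:urgent,  "server down production outage crash")
bayes.train(:routine, "typo in help text documentation wording")
bayes.classify("production server crash")  # => :urgent
```

The disruptive part isn't the math, it's the shape of the program: instead of writing `if`/`case` rules to classify issues in advance, you feed the system examples and let the statistics do the branching.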
"Your old-fashioned ideas are no damn good!"
Richard Feynman
Yet when I step out of my comfort zone, I realize that I've seen this before (experience can be handy at times). Many of you young 'uns have never known a programming world without polymorphism (although, judging by some of the code that has caused me to say "WTF?", not as many as I would like). Before messages, virtual functions, and method calls, there were switches, cases, and if statements.

There was an "aha!" moment for me when I suddenly grokked polymorphism, when I understood that switch statements were junk. Maybe Bayesian inference can change programming the same way that polymorphism changed programming.
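That "aha!" translates directly into code. A small Ruby sketch of the before and after (the shapes are my own illustrative example): the first version dispatches with a case statement, the second lets each object answer for itself.

```ruby
# Before polymorphism: dispatch by case statement.
# Every new shape means editing this one function.
def area(shape)
  case shape[:kind]
  when :circle then Math::PI * shape[:radius] ** 2
  when :square then shape[:side] ** 2
  end
end

# With polymorphism: each class owns its own behaviour.
# Adding a new shape touches no existing code.
class Circle
  def initialize(radius)
    @radius = radius
  end

  def area
    Math::PI * @radius ** 2
  end
end

class Square
  def initialize(side)
    @side = side
  end

  def area
    @side ** 2
  end
end

[Circle.new(1), Square.new(2)].map(&:area)
```

The switch version classifies in advance; the polymorphic version delegates the decision. Bayesian classification pushes the same idea one step further: the decision isn't even written down, it's learned.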

I need to find out.

And now I'd like to quote the other Steve, the one who isn't a hacker and wasn't presenting at startup school. (Psst! Steve! Do you want to sell coloured MP3 players for the rest of your life, or do you want to change the world again?)

One more thing

For a very long time I've been carrying a conjecture around. Paul Graham supplied me with the framework for thinking about the conjecture:
Treating a startup idea as a question changes what you're looking for. If an idea is a blueprint, it has to be right. But if it's a question, it can be wrong, so long as it's wrong in a way that leads to more ideas.
Paul Graham, Ideas for Startups
The "Graham Question" is: Can we predict the future of a software development project with objective observation?

Let's take a simple example, one that ran through my head this morning. I've worked for several companies that used issue-tracking systems, each with a little widget or enumeration for declaring the priority of an issue. I've also worked with Scrum-like teams that maintained priority with a master list or backlog.

You want to know which issues will be fixed/implemented/done by a certain date. What is the significance of the priority?

Well, this is management 101. Start with the number of hours available for development, then take the highest-priority issues and estimate how much time is required to do them. Your prediction is that the team will do as many of the highest-priority items as possible in the time available.
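That "management 101" prediction is simple enough to sketch in a few lines of Ruby. The issue names, priorities, and estimates here are invented for illustration; the rule is just "walk the list in priority order until the hours run out":

```ruby
# Predict which issues get done: take issues in priority order,
# stopping at the first one that no longer fits in the budget.
def predicted_done(issues, hours_available)
  done = []
  issues.sort_by { |issue| issue[:priority] }.each do |issue|
    break if issue[:estimate] > hours_available
    hours_available -= issue[:estimate]
    done << issue[:name]
  end
  done
end

issues = [
  { name: "login bug",     priority: 1, estimate: 8 },
  { name: "new report",    priority: 2, estimate: 16 },
  { name: "button colour", priority: 3, estimate: 2 }
]

predicted_done(issues, 25)  # => ["login bug", "new report"]
```

The point of writing it out is how brittle it looks: the function is one big assumption that priorities and estimates stay fixed, which is exactly what the next paragraph calls into question.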

All well and good, but in reality "Spolsky" isn't in the dictionary and neither is "Braithwaite." For that matter, neither is "p.s.", and why should I have to click "add to dictionary" after the program has watched me type this thing and not correct it hundreds of times?

And lo, if we watch an actual software project, we see that over time priorities change, new issues are added to the mix, sometimes low-priority items are done first for some reason, and humans just can't seem to follow the damn plan. Yet software emerges from the other end anyway.

It's easy to say, "Your old-fashioned priority is no damn good." But maybe we are not young enough to know everything. Maybe we should ask a question: what good is the priority if you can't construct a nice if statement around it?

Maybe this is like Spolsky and Braithwaite and error IDs and help text. Maybe there is no formula up front, but we can watch what people do hundreds of times.

Maybe Thomas Bayes knows the significance of the priority.


Comments on "I'm not young enough to know everything":
38 tops for doing a startup? Ageist BS. I know plenty of men in their forties doing a startup. This includes myself.
 
I want to be Paul Graham when I grow up. The fact that I'm older than him isn't deterring me a bit.
 
beef jezos: I'd imagine that's meant in the way that there's an age at which a woman becomes 'too old' to give birth. That's not to say it's not going to happen, nor to say that it's necessarily going to fail. But it does mean that it's _more likely_ to fail.

I think it should also be noted that Paul Graham writes almost exclusively about software-based startups, not necessarily startups of all types.

I would also like to note that I'm just pointing out how I read the Paul Graham stuff, not necessarily stating my own opinions.
 
I'm just about 44, and I'm going to show that ageist bastard Paul Graham who's not too old to do a successful startup, just as soon as I finish my mid-afternoon nap.
 
I enjoyed this article a lot, both today and the first time I read it... both times the title (and my 35-year-old lack of memory) managed to mislead me into thinking that your conclusion would be the opposite.

I don't disagree with your suggestion that the older you are, the less likely startup success is in the cards. For me the issue of financial stability and having children is a major factor. Nevertheless, I keep trying. (To have a successful startup, not to have more children ;-)

But where I disagree very very much is on the whole notion that supposedly 22 year olds know they don't know much. Perhaps in theory they know somewhere in their intellect that such is the case. But in practice, my personal experience over and over again is that "the more I know, the more I know I don't know".

In fact this has been such a dominant theme of my life, and has matched my observations of so very many young whippersnappers over recent years, that it has become more and more of a conscious effort to embrace uncertainty. As time goes on there is so much more, not less, uncertainty.

I forever find out that I'm dumber than I thought I was, and my patterns of thought grow more used to accepting that as a factor in future planning.

Perhaps these ideas are less common in other people. But I sort of doubt it. The college idealist is the stuff of legends, and basically they are the quintessential embodiment of "a little knowledge can be dangerous". Another lovely example is my 11-year-old daughter, who knows absolutely everything and deconstructs the generations, including mine, as a hobby, with laughable but completely sincere results.
 
Reg Braithwaite