raganwald
Wednesday, March 22, 2006
  An Encounter with a Programming Interview Problem
From the NexTag Java Development Jobs Page:

Given a deck of nCards unique cards, cut the deck iCut cards from top and perform a perfect shuffle. A perfect shuffle begins by putting down the bottom card from the top portion of the deck followed by the bottom card from the bottom portion of the deck followed by the next card from the top portion, etc., alternating cards until one portion is used up. The remaining cards go on top. The problem is to find the number of perfect shuffles required to return the deck to its original order. Your function should be declared as:

static long shuffles(int nCards,int iCut);

Please send the result of shuffles(1002,101) along with your program and your resume to 'resume' at nextag.com.
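For concreteness, the shuffle described above can be simulated directly. The sketch below is mine, not the code from the original post: `shuffleOnce` follows the put-down procedure literally, and the naive `shuffles` simply repeats it until the deck comes back — workable when the answer is small, though (as the comments below note) the count for the stated test case runs into the billions, so this approach alone won't do.

```java
import java.util.Arrays;

public class PerfectShuffle {
    // One perfect shuffle: cut iCut cards from the top, then put down
    // the bottom card of the top portion, the bottom card of the bottom
    // portion, and so on, alternating until one portion runs out.
    // The leftover cards go on top, keeping their order.
    static int[] shuffleOnce(int[] deck, int iCut) {
        int n = deck.length;
        int[] next = new int[n];
        int top = iCut - 1;     // bottom card of the top portion
        int bottom = n - 1;     // bottom card of the bottom portion
        int put = n - 1;        // fill the new pile from the bottom up
        boolean fromTop = true;
        while (top >= 0 && bottom >= iCut) {
            next[put--] = fromTop ? deck[top--] : deck[bottom--];
            fromTop = !fromTop;
        }
        while (top >= 0) next[put--] = deck[top--];          // leftovers keep order
        while (bottom >= iCut) next[put--] = deck[bottom--];
        return next;
    }

    // Naive count: shuffle until the deck returns to its original order.
    // Only viable when the answer is small.
    static long shuffles(int nCards, int iCut) {
        int[] original = new int[nCards];
        for (int i = 0; i < nCards; i++) original[i] = i;
        int[] deck = shuffleOnce(original, iCut);
        long count = 1;
        while (!Arrays.equals(deck, original)) {
            deck = shuffleOnce(deck, iCut);
            count++;
        }
        return count;
    }
}
```

For example, with three cards and a cut of two, `[0, 1, 2]` becomes `[0, 2, 1]` after one shuffle and returns to order after two.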

I saw mention of this over on reddit.com. It couldn't have come at a better time. I'd just read a provocative essay on programming's mediocrity: Beating the Average makes the point that programmers are terrible at judging their own skills. We all think we're good enough, and our environment does not promote the kind of objective competition and ranking that would shake us out of our complacency.

There's a name for this: The Lake Wobegon Effect. (This is related to the Blub Paradox, of course). How do we break out of this comfort zone? One way is by solving puzzles and problems where our solution is ranked against others. With that in mind, I decided to try this puzzle over lunch today.

My code runs in sixteen milliseconds or so under JDK 1.5/Eclipse on a garden-variety PC. After googling around, I discovered that the first version of my code could have been tighter and faster. That's terrific: I actually learned something by not being the best and the brightest today.

But what worries me is this: how do I know whether my code could be even better? The Blub Paradox dooms me to thinking I've got the answer, when perhaps I could do better still. (If my code could be better, please comment or email. I want to hear from you!)

My tip is to seek out problems like this from time to time. Don't settle for thinking up the approach in your head and considering the actual coding to be an insignificant exercise in typing. Go ahead and write clean, debugged code, preferably with a time limit to put some pressure on yourself.

(By the way, some people argue that this type of problem has little to do with Enterprise stacks and all the other cruft of a daytime gig. This is a thinly veiled excuse for not trying, and roughly equivalent to an NFL linebacker refusing to do the bench press because an actual football game is more complicated than simply pushing an iron weight over his head while lying flat on his back.)


Comments on “An Encounter with a Programming Interview Problem”:
Hmm. Looks like you're actually shuffling the decks in the program. It seems (although I definitely haven't thought this thru) you could get the answer with a formula. All they're asking for is how many shuffles it would take, so perhaps you don't actually need to shuffle, you can figure out the math behind it.

I purposely haven't googled this or looked at the code in detail, would like to try this out when I get a minute, so I may be way off.
 
how do I know whether my code could be even better?

Decades of computer science have not found an answer to this. How do we know there is no sorting algorithm faster than quicksort, or better than O(n log n)?
 
Thanks for pointing out this fun puzzle. If you are interested in seeing a solution in Haskell, I put mine up on my blog:

The "perfect shuffles" puzzle (solved in Haskell)

Cheers,
Tom
 
"How do we know there is no sorting algorithm faster than quicksort, or better than O(n log n)?"

Uh, (1) we do: for some kinds of input there are O(n) sorts (bucket and radix), and (2) because you can derive a lower bound of Ω(n log n) for sorting (if all you have is a general comparison function) by looking at the depth of the decision tree needed to distinguish all possible permutations of the list. And since we already have O(n log n) sorts, you've got a matching upper bound too, so you're done.
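To make point (1) concrete, here is a minimal counting-sort sketch (my illustration, not code from the thread): with n integer keys known to lie in [0, k), it sorts in O(n + k) time without ever comparing two elements, so the comparison-sort lower bound doesn't apply.

```java
import java.util.Arrays;

public class CountingSortDemo {
    // Counting sort: O(n + k) when all keys are integers in [0, k).
    // No element-to-element comparisons, so the Omega(n log n)
    // comparison lower bound does not constrain it.
    static int[] countingSort(int[] a, int k) {
        int[] count = new int[k];
        for (int x : a) count[x]++;          // tally each key
        int[] sorted = new int[a.length];
        int i = 0;
        for (int v = 0; v < k; v++)          // emit keys in ascending order
            while (count[v]-- > 0) sorted[i++] = v;
        return sorted;
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(countingSort(new int[]{3, 1, 2, 1, 0}, 4)));
        // prints [0, 1, 1, 2, 3]
    }
}
```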

Now whether you can get a smaller constant up front, or there's some assumption you can make that allows a better solution, that's the rub.
 
http://blogsummaries.blogspot.com/ has a simple and elegant solution for this. Runs in 0 ms from time to time, 10 ms otherwise (due to System.currentTimeMillis() being imprecise).
 
"Runs in 0 ms from time to time, 10 ms otherwise (due to System.currentTimeMillis() being imprecise)"

Now I know why I sometimes get 16 ms and sometimes zero, but nothing else :-)
 
Okay, I know I'm coming late to the show here, but... You've fallen into a trap with a "general solution to a particular problem." You could have written a "correct" solution in 5 lines of Java code containing just System.out.println("5812104600"); that executed in 1ms, but obviously the intention here was to write an algorithm, so...

First, the code contains a bug. You checked the one stated test case, but did you try others? How about edge cases? Try nCards=3 and iCut=2.

Second, your algorithm has quadratic time complexity, but is "good enough" given the small domain of the stated problem. What if nCards=100002 and iCut=101? On my system your code took over 3 minutes and executed the inner-most loop 1.4 billion times.

I took a swipe at this problem and wrote a linear time algorithm and thought I was fairly clever... 'til I read the comments of the O'Reilly solution and noticed someone had already posted what is essentially a linear time optimization to that solution. Still, it ran the 100002,101 test case in 31 milliseconds and executed the innermost loop 100002 times (it really is O(n)).
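The linear-time idea described here — perform one shuffle to obtain the permutation, then take the least common multiple of its cycle lengths, visiting each cycle only once — might be sketched as follows. This is my reconstruction under the problem statement's shuffle rules, not Wavicle's or the O'Reilly code, and all names are invented.

```java
public class FastShuffles {
    // One perfect shuffle of the identity deck: next[i] is the original
    // position of the card that ends up at position i after one shuffle.
    static int[] permutation(int nCards, int iCut) {
        int[] next = new int[nCards];
        int top = iCut - 1, bottom = nCards - 1, put = nCards - 1;
        boolean fromTop = true;
        while (top >= 0 && bottom >= iCut) {
            next[put--] = fromTop ? top-- : bottom--;
            fromTop = !fromTop;
        }
        while (top >= 0) next[put--] = top--;
        while (bottom >= iCut) next[put--] = bottom--;
        return next;
    }

    static long gcd(long a, long b) { return b == 0 ? a : gcd(b, a % b); }

    // The order of the permutation is the LCM of its cycle lengths.
    // Marking visited positions makes this O(n): each position is
    // touched exactly once, no matter how many shuffles the answer is.
    static long shuffles(int nCards, int iCut) {
        int[] p = permutation(nCards, iCut);
        boolean[] seen = new boolean[nCards];
        long lcm = 1;
        for (int start = 0; start < nCards; start++) {
            if (seen[start]) continue;
            int len = 0;
            for (int i = start; !seen[i]; i = p[i]) {  // walk one cycle
                seen[i] = true;
                len++;
            }
            lcm = lcm / gcd(lcm, len) * len;
        }
        return lcm;
    }
}
```

Because only one shuffle is ever simulated, the running time is independent of how large the answer is — which is what makes the big test cases tractable.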

And just because I'm truly shameless, here's a link to the blogger blog I just created for it.
 
Wavicle:

Yes, I'm aware of the improved algorithm. However, the code presented above is more-or-less what I came up with immediately, and that's what I would have written had I been asked to compose an answer on the spot.

Should I change it to reflect further research? I didn't think that would be appropriate. But I'm glad you've blogged your solution and especially glad that you have provided a concise explanation of the differences in complexity.

Thanks.
 




Reg Braithwaite

