The 128-bit programming challenge
Information wants to be free
Here’s a programming challenge:
Write a program that produces 128 specific bits of output. In the simplest case, output those 128 bits in a standard numeric form such as sixteen pairs of hexadecimal digits. Your program may not contain the 128 bits in literal form; it must manufacture them in some way.
Furthermore:
- Novelty counts. Try to think of a really unique way to generate the bits. Imagine you are determined to protect our freedom to program: once the Monopolistic Overlords detect that a particular program produces illegal codes, they will send cease-and-desist letters to anyone distributing that program, so the greater good is served by collecting many different ways to generate the 128 bits of information. Or, if you prefer a less weighty objective, imagine this is a job interview question. If the interviewers have seen your solution before, you won't stand out from the crowd.
- You may express the 128 bits of output in any other form you like; however, more complex or obscure formulations, such as steganographic images, should be accompanied by a method or program to recover the output into numeric form.
- Oh yes, the bits. Your program must produce

      00001001 11111001 00010001 00000010
      10011101 01110100 11100011 01011011
      11011000 01000001 01010110 11000101
      01100011 01010110 10001000 11000000

  or any equivalent representation, such as

      09 F9 11 02 9D 74 E3 5B D8 41 56 C5 63 56 88 C0

  as its output.
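To pin down the expected output format, here is a minimal sketch. Python and the pad string are my own arbitrary choices (the challenge is language-agnostic), and this is the obvious, deliberately low-novelty baseline that the rules above ask you to improve on: store an arbitrary 16-byte pad together with the key XORed against that pad, and recombine the two at run time, so neither constant contains the 128 bits in literal form.

```python
# A minimal sketch: reconstruct the 128 bits by XORing two constants,
# neither of which is the key itself.
PAD    = b"freedom to code!"                                # arbitrary 16-byte pad
MASKED = bytes.fromhex("6f8b7467f91b8e7bac2e76a60c32ede1")  # key XOR PAD

key = bytes(p ^ m for p, m in zip(PAD, MASKED))
print(" ".join(f"{b:02X}" for b in key))  # sixteen pairs of hexadecimal digits
```

Running it prints the sixteen hex pairs above. Any split of the key into a pad and a masked residue works equally well, which is exactly why novelty, not mere correctness, is the interesting constraint.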
Come on folks, we’re programmers. We probably blew millions of person-hours writing programs that counted from one to one hundred. Let’s stand up and prove that when the chips are down, we can invest a few minutes and write something that’s both interesting to code and meaningful.
What’s the smallest program that produces the 128 bits? Can it be done in pure lambda calculus? Is there a self-modifying program that turns itself into the 128 bits? Can it be done with an Enterprisey Service-Oriented Architecture? Haskell? C#? Let’s really do this right!!
Follow-up: Quality.