Open problems in metabiology.
(We are all random walks in program space.)

October 4th, 2009

Gregory Chaitin is on tour promoting his new field – metabiology. As Chaitin conceives it, metabiology is the study of the evolution of computer programs, with the goal of proving theorems about the circumstances under which evolution occurs. Its ultimate goal, as the name suggests, is proving that under Earth-like conditions, DNA-based computers must evolve.

Key to Chaitin’s notion of evolution is something he calls creativity, and he explored this idea a little bit in a talk at the University of Toronto’s Centre for Mathematical Medicine. To understand his first theorems in this area, you need to (roughly) understand the Busy Beaver problem of Tibor Radó. A good précis is here. Essentially, a busy beaver is a Turing machine that runs as long as any halting machine of its size can, and then halts. The Busy Beaver function, BB(n), is the largest whole number produced by an n-bit busy beaver.
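To make the definition concrete, here is a minimal sketch in Python. It is not Radó’s Turing-machine formulation – the four-opcode toy language and the step limit (a practical stand-in for the undecidable halting test) are my own assumptions – but it shows the shape of the function: brute-force every n-instruction program, keep the ones that halt, and take the largest output.

```python
from itertools import product

def run(program, step_limit=1000):
    """Interpret a toy program (a tuple of opcodes).
    0 -> halt, 1 -> increment, 2 -> double, 3 -> jump back to start.
    Returns the accumulator if the program halts within step_limit, else None."""
    acc, pc = 0, 0
    for _ in range(step_limit):
        if pc >= len(program):
            return acc            # ran off the end: treat as halting
        op = program[pc]
        if op == 0:               # explicit halt
            return acc
        elif op == 1:
            acc += 1
        elif op == 2:
            acc *= 2
        else:                     # jump to start: may loop forever
            pc = -1
        pc += 1
    return None                   # did not halt within the budget

def toy_bb(n_ops):
    """Toy Busy Beaver: largest output of any halting n_ops-instruction program."""
    best = 0
    for program in product(range(4), repeat=n_ops):
        out = run(program)
        if out is not None:
            best = max(best, out)
    return best
```

The real BB(n) has no such step limit, which is exactly why it is uncomputable – there is no general way to tell the slow halters from the non-halters.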

So, to Chaitin’s first theorem in metabiology …

He begins with a single organism – a Turing machine. He mutates this organism, and then either keeps the original and throws away the mutant, or vice versa, depending on which is more fit. The fitness function is based on the Busy Beaver problem. If the mutant halts, and, upon halting, produces a higher whole number than the original, then the mutant wins. If not, it loses.
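That selection rule is a hill climb through program space, and it can be sketched in a few lines. This is only an illustration of the scheme, not Chaitin’s actual model: the organisms here are programs in an assumed toy opcode language (0 = halt, 1 = increment, 2 = double, 3 = jump to start), and a step limit stands in for the halting test.

```python
import random

def run(program, step_limit=1000):
    """Toy interpreter; returns output if the program halts in time, else None."""
    acc, pc = 0, 0
    for _ in range(step_limit):
        if pc >= len(program) or program[pc] == 0:
            return acc
        if program[pc] == 1:
            acc += 1
        elif program[pc] == 2:
            acc *= 2
        else:
            pc = -1               # jump back to the start
        pc += 1
    return None                   # treated as non-halting

def evolve(n_ops=6, rounds=2000, seed=0):
    """Chaitin-style hill climb: mutate one opcode; keep the mutant
    only if it halts and outputs strictly more than the current organism."""
    rng = random.Random(seed)
    organism = (1,) * n_ops       # start with something that halts
    fitness = run(organism)
    for _ in range(rounds):
        i = rng.randrange(n_ops)
        mutant = organism[:i] + (rng.randrange(4),) + organism[i + 1:]
        out = run(mutant)
        if out is not None and out > fitness:
            organism, fitness = mutant, out
    return organism, fitness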

Now, BB(n) is uncomputable. In fact, it has no computable bound. Nevertheless, Chaitin shows that random mutations will, in exponential time (in the number of bits, n, of the organism), result in the computation of the Busy Beaver function for n!
(That was an exclamation point, not a factorial sign.)

In other words, evolution causes fitness to increase faster than any computable function. Chaitin calls this “evidence of biological creativity”. This is a nice result, but one that Chaitin finds less than satisfactory. In real life evolution is cumulative, while Chaitin’s proof requires assuming that evolution sometimes starts over from scratch. He really wants to exhibit an evolutionary process that is, in some sense, cumulative as well as creative. His second theorem uses his infamous halting probability, Ω, to construct a cumulative path through program space to arbitrary levels of complexity. But this doesn’t satisfy Chaitin either, since the process is unstable, in a sense that he didn’t really explain.

Beyond these two theorems, the field is open. Things to work on seem to be:

i. Without changing the model, can Chaitin’s desired result (cumulative evolution) be proved?

ii. Part of the utility of Chaitin’s fitness function is that it explicitly rewards complexity. This fits with the observation that life, in general, evolves to become more complex. But complexity is, I think, typically seen as an epiphenomenon of fitness, and not as the very definition of fitness. Can a “Darwinian” fitness function be chosen such that complexity is not explicitly rewarded AND such that life can be proven to evolve to arbitrary complexity?

iii. Once we exhaust the limits of what we can prove without an environment, what happens when we introduce an environment, which interacts with the organism, exchanges information with the organism, and which can change, suddenly or gradually?

Of course, (iii) might not be necessary. If (ii) can be proven, then, in a sense, case closed: life must evolve. Some might even say that (ii) isn’t necessary.

But I suspect that Chaitin expects (i) to be very hard. Hence his enthusiasm. In fact, I suspect that he suspects that Ω is going to be all over metabiology, and that some of its fundamental questions will prove to be (mathematically) unknowable.

But algorithmic information theory (AIT) is only one extra-biological approach to evolution. Another is thermodynamics. Eric Chaisson, for example, argues that the Earth, bathed in solar radiation, has a natural tendency towards lower entropy and higher complexity. Is an AIT/thermodynamics synthesis possible? Google says: yes (and it’s been around a while).