
Saturday 14 April 2012

23. Natural Phenomena are just Computations




How much information (in terms of number of bits) is needed for specifying the set of all natural numbers: 0, 1, 2, 3, . . . ? The sequence runs all the way to infinity, so does it have an infinite information content? Something is wrong with that assertion. We can generate the entire sequence by starting from 0, adding 1 to get the next member of the set, adding 1 again to get the member after that, and so on. Because of this order or structure in the sequence, an algorithm can be set up for generating the entire set of numbers, and the number of bits needed to write the corresponding computer program is rather small.
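In programming terms, the whole infinite set is pinned down by a finite, and indeed tiny, program. A minimal sketch in Python (the function name is mine, purely illustrative):

    from itertools import islice

    def naturals():
        """The entire infinite set, generated by a few lines of code."""
        n = 0
        while True:
            yield n        # hand out the current member
            n += 1         # the rule: add 1 to get the next member

    print(list(islice(naturals(), 6)))    # [0, 1, 2, 3, 4, 5]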

The number of bits in the shortest computer program that can generate a given set of numbers or data is called the 'algorithmic information content' (AIC) of that data set. Algorithmic Information Theory (AIT) is the modern discipline built around this idea, and it extends and sharpens classical (Shannon) information theory. But such ideas about COMPRESSION OF INFORMATION have a long history.

Leibniz (1675) was amongst the earliest known investigators of compressibility (or otherwise) of information. He argued that a worthwhile algorithm or theory of anything has to be ‘simpler than’ the data it explains. Otherwise, either the theory is useless, or the data are ‘lawless’. The idea of ‘simpler than’ is best expressed in terms of AIC defined above.

The information in a set of data can be compressed into an algorithm only if there is something nonrandom or ordered about the data. There must be some structure or regularity, and we must be able to recognize that regularity or 'rule' or 'law'. Only then can we construct the algorithm that generates the entire set of data.
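A crude but concrete way to see this on an ordinary computer is to use a general-purpose compressor as a stand-in for the 'shortest program': the compressed size (plus the fixed size of the decompressor) gives an upper bound on the AIC of a data set. A minimal Python sketch, with illustrative data of my own choosing; the exact byte counts vary from run to run, but the gap between the two cases is dramatic:

    import os
    import zlib

    structured = bytes(range(256)) * 4096       # 1 MiB produced by an obvious rule
    random_data = os.urandom(1024 * 1024)        # 1 MiB with (almost surely) no rule at all

    print(len(structured), "->", len(zlib.compress(structured, 9)))    # shrinks to a few KB
    print(len(random_data), "->", len(zlib.compress(random_data, 9)))  # does not shrink (in fact grows slightly)

An off-the-shelf compressor can only exploit the simple kinds of regularity it was designed to find, so this is merely an upper bound on the AIC, not the AIC itself; but it captures the essential point that structure is what makes compression possible.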

In fact, this is how we discover and formulate the laws of Nature. A statement of a law is nothing but a compression of information: it uses far fewer bits than would be needed to describe the entire set of observations about Nature that it accounts for.

Consider two numbers, both requiring, say, a million bits to specify to the desired accuracy. Let one of them be an arbitrary random number, meaning that there is no defining pattern or order or structure for specifying it. Let the other be the familiar π (= 3.14159…). The second number has a very small AIC, because a small computer program can be written that outputs it to any desired level of precision (π is the ratio of the circumference of a circle to its diameter). By contrast, a random number (say 1.47373..59) has a much higher AIC: the shortest program for outputting it contains nearly as many bits as the number itself, because essentially all it can do is quote the digits verbatim:

         # quote the digits verbatim; no shorter rule exists for a random number
         print("1.47373..59")

No significantly smaller program can generate this sequence of digits. The digit stream in this case has no redundancy or regularity, and is said to be incompressible. Such digit streams are called irreducible or algorithmically random.
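To make the contrast concrete, here is the π side of the comparison as a runnable Python sketch. It streams the decimal digits of π using Gibbons' 'unbounded spigot' algorithm (one of several standard ways to do this); the point is only that a fixed handful of lines will emit as many digits as you are willing to wait for, which is exactly what cannot be done for an algorithmically random digit stream.

    from itertools import islice

    def pi_digits():
        """Stream the decimal digits of pi (Gibbons' unbounded spigot algorithm)."""
        q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
        while True:
            if 4*q + r - t < n*t:
                yield n                               # the next digit is now certain
                q, r, n = 10*q, 10*(r - n*t), 10*(3*q + r)//t - 10*n
            else:                                     # absorb one more term of the series
                q, r, t, k, n, l = q*k, (2*q + r)*l, t*l, k + 1, (q*(7*k + 2) + r*l)//(t*l), l + 2

    print(list(islice(pi_digits(), 10)))              # [3, 1, 4, 1, 5, 9, 2, 6, 5, 3]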

Such considerations have led to the conclusion that there are limits to the powers of logic and reason. Gregory Chaitin has shown that certain facts are not just computationally irreducible; they are logically irreducible as well. Their 'truth' cannot be derived by reasoning from anything simpler; it can only be adopted in the form of additional axioms.


In science we use mathematical equations for describing natural phenomena. This approach has played a crucial role in the advancement of science for several centuries. However, the advent of computers has led to a paradigm shift:

THE LAWS OF NATURE CAN BE VIEWED AS COMPUTER PROGRAMS, AND THE BEHAVIOUR OF ANY SYSTEM CAN BE VIEWED AS CORRESPONDING TO A COMPUTATION.

I quote Seth Lloyd (2006):
The natural dynamics of a physical system can be thought of as a computation in which a bit not only registers a 0 or 1 but acts as an instruction: 0 means ‘do this’ and 1 means ‘do that’. The significance of a bit depends not just on its value but on how that value affects other bits over time, as part of the continued information processing that makes up the dynamical evolution of the universe.
Remember, the laws of Nature are quantum-mechanical. THE UNIVERSE IS ONE BIG QUANTUM COMPUTER.
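One toy classical illustration of this viewpoint (my example, not Lloyd's) is an elementary cellular automaton. The 'law' is a tiny lookup table saying what each bit must do next, given its neighbours; the 'behaviour' of the system is nothing but the repeated application of that table. Rule 110, used below, is even known to be computationally universal. A minimal Python sketch:

    RULE = 110                                  # the 'law': an 8-entry lookup table

    def step(cells):
        """One tick of the dynamics: every bit is updated from (left, self, right)."""
        n = len(cells)
        return [(RULE >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
                for i in range(n)]

    cells = [0] * 31 + [1] + [0] * 32           # initial condition: one 'on' bit in a ring of 64
    for _ in range(16):                         # the system's time evolution is just computation
        print("".join(".#"[c] for c in cells))
        cells = step(cells)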

As I shall explain in a later post, the computational approach to basic science and mathematics has also made us realize that there are limits to how far we can carry our reasoning processes for understanding or describing natural phenomena. Beyond that limit, it is like Alice in Wonderland:
‘Do cats eat bats? Do cats eat bats?’ Sometimes she asked, ‘Do bats eat cats?’ For you see, as she couldn’t answer either question, it didn’t much matter which way she put it.
More on this later, when I introduce you to Laplace's demon.
