It is perhaps a sobering thought that we seem so inconsequential in the Universe. It is even more humbling at first – but then wonderfully enlightening – to recognize that evolutionary changes, operating over almost incomprehensible space and nearly inconceivable time, have given birth to everything seen around us. Scientists are now beginning to decipher how all known objects – from atoms to galaxies, from cells to brains, from people to society – are interrelated (Chaisson 2002).
At the moment
of the Big Bang, the information content of the universe was zero, assuming
that there was only one possible initial state and only one self-consistent set
of physical laws. When spacetime began, the information content of the quantum
fields was nil, or almost nil. Thus, in the beginning, the effective complexity
(cf. Part 38) was zero, or
nearly zero. This is consistent with the fact that the universe emerged out of nothing.
As the early universe expanded, it drew more and more energy out of the quantum fabric of spacetime. Under continuing expansion a variety of elementary particles were created, and the energy drawn from the underlying quantum fields was converted into heat. The initial elementary particles were therefore very hot and rapidly growing in number, so the entropy of the universe increased rapidly. And high entropy means that the particles require a large amount of information to specify their coordinates and momenta. This is how the degree of complexity of the universe grew in the beginning.
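To make the entropy-information link concrete (this is the standard textbook relation, not anything specific to the sources cited here): a system that can be in any one of $W$ microstates has Boltzmann entropy

\[ S = k_B \ln W , \]

and picking out which microstate the system is actually in takes

\[ I = \log_2 W = \frac{S}{k_B \ln 2} \ \text{bits}. \]

So the hotter and more numerous the particles, the larger $W$, and the more bits are needed to pin down their coordinates and momenta.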
Soon after that, quantum fluctuations led to density fluctuations and to the clumping of matter, making gravitational effects more and more important as time went on. And the present extremely large information content of the universe results, in part, from the quantum-mechanical nature of the laws of physics. The language of quantum mechanics is that of probabilities, not certainties. This inherent uncertainty in the description of the present universe means that a very large amount of information is needed for that description.
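The usual way to quantify the information carried by a probabilistic description (a standard Shannon-theory relation, added here for illustration) is the entropy of the distribution over outcomes:

\[ H = -\sum_i p_i \log_2 p_i \ \text{bits}. \]

$H$ is largest when many outcomes are roughly equally likely, so an irreducibly probabilistic quantum description of a large system carries a correspondingly large information content.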
But why does
the degree of complexity go on increasing? To answer that, I have to refer to
the concept of algorithmic probability (AP) introduced in Part 34 while
discussing Ockham’s razor. Ockham’s razor tells us that short, simple programs or ‘laws’ are the most likely explanations of natural phenomena, which in the present context means explaining the evolution of complexity in the universe. I explained this by introducing the metaphor of an unintelligent monkey randomly typing the digits 1 and 0, each such sequence of binary digits offering a possible ‘simple program’ for generating an output that may explain a set of observations.
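A toy simulation makes the metaphor concrete. The sketch below is my own illustrative code, not anything from the cited sources: it estimates the chance that a monkey's first few random keystrokes happen to spell out a given bit-string 'program'. The empirical frequency approaches $2^{-\ell}$ for a program of length $\ell$, which is exactly why short programs dominate.

```python
import random

def monkey_hit_rate(program: str, trials: int = 200_000) -> float:
    """Fraction of trials in which a randomly typed bit string
    begins with the given binary `program`."""
    ell = len(program)
    hits = 0
    for _ in range(trials):
        typed = "".join(random.choice("01") for _ in range(ell))
        if typed == program:
            hits += 1
    return hits / trials

# Each extra bit halves the probability: short programs dominate.
for prog in ["10", "1010", "10101010"]:
    print(f"{prog:>8}: observed {monkey_hit_rate(prog):.5f}, "
          f"expected {2 ** -len(prog):.5f}")
```

In Solomonoff's formulation, the algorithmic probability of an output $x$ is $m(x) = \sum_{p\,:\,U(p)=x} 2^{-\ell(p)}$, summed over all programs $p$ that make a universal computer $U$ print $x$; the shortest such program dominates the sum, which is Ockham's razor restated.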
The
quantum-mechanical laws of physics are the simple computer programs, and the
universe is the computer (cf. Part 23). But what is
the equivalent of the monkey, or rather a large number of monkeys, injecting
more and more information and complexity into the universe by programming it
with a string of random bits? According to Seth Lloyd (2006), ‘quantum fluctuations are the monkeys that
program the universe’.
The current thinking is that the universe will continue to expand, and that it is spatially infinite (according to some experts). But the speed of light is finite.
Therefore, the causally connected
part of the universe has a finite size, limited by what has been called the ‘horizon’ (Lloyd 2006). The quantum
computation being carried out by the universe (cf. Part 23) is confined
to this part. Thus, for all practical purposes, the part of the universe within
the horizon is what we can call ‘the universe’. As this universe expands, the
size of the causally connected region increases, which in turn means that the
number of bits of information within the horizon increases, as does the number
of computational operations. Thus the expanding universe is the reason for the
continuing increase in the degree of complexity of the universe.
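Lloyd makes this quantitative. The Margolus-Levitin theorem bounds how many elementary operations a system of average energy $E$ can perform in time $t$:

\[ \#\text{ops} \;\le\; \frac{2Et}{\pi\hbar} . \]

Plugging in the energy content and age of the universe within the horizon, Lloyd (2006) estimates that the universe has so far performed on the order of $10^{120}$ operations on roughly $10^{90}$ bits, and both numbers keep growing as the horizon grows. (The bound is standard; the specific exponents are Lloyd's estimates, and they depend on exactly what one chooses to count.)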
The expansion
of the universe is a necessary cause (though perhaps not a sufficient cause)
for all evolution of complexity, because it creates gradients of various kinds: ‘Gradients forever having been enabled by the
expanding cosmos, it was and is the resultant flow of energy among innumerable
non-equilibrium environments that triggered, and in untold cases still
maintains, ordered, complex systems on domains large and small, past and
present’ (Chaisson 2002). The
ever-present expansion of the universe gives rise to gradients on a variety of
spatial and temporal scales. And,
‘it is the contrasting temporal behaviour of various energy densities that has
given rise to those environments needed for the emergence of galaxies, stars,
planets, and life’ (Chaisson 2002).
In the grand cosmic scenario, there was only physical evolution in the beginning, and it prevailed for a very
long time. While physical evolution still continues, the emergence of life
started the phenomenon of biological evolution:
Although it is difficult to say why the universe is so organized, the measured universal expansion since the Big Bang of space continues to provide a “sink” (a place) into which stars as sources can radiate: A progenitive cosmic gradient, the source of the other gradients, is thus formed by cosmic expansion. For the foreseeable future the geometry of the universe’s expansion continues to create possibilities for functionally creative gradient destruction, for example, into space and in the electromagnetic gradients of stars. Once we grasp this organization, however, life appears not as miraculous but rather another cycling system, with a long history, whose existence is explained by its greater efficiency at reducing gradients than the nonliving complex systems it supplemented (Margulis and Sagan 2002).