
Saturday 28 July 2012

38. Complex Adaptive Systems

Complex adaptive systems (CASs) are complex systems that not only evolve like any other dynamical system, but also learn by making use of the information they have acquired.
Learning by a CAS requires, among other things, the evolution of an ability to distinguish between the random and the regular. CASs can undergo processes like biological evolution (or biological-like evolution). They do not just operate in an environment created for them initially; they have the capability to change that environment. For example, species, ant colonies, corporations, and industries evolve to improve their chances of survival in a changing environment. Similarly, the marketplace adapts to factors like immigration, technological developments, prices, availability of raw materials, and changes in tastes and lifestyles. Some more examples of CASs: a baby learning to walk; a strain of bacteria evolving resistance to an antibiotic; a beehive or ant colony adjusting to the loss of a part of its population.

By contrast, complex materials (discussed in my book Smart Structures) are examples of nonadaptive complex systems, as are galaxies, stars, and other such complex inanimate objects. They evolve with time, but within the unchanging constraints provided by the initial conditions and the environment.


I list here the characteristic features of CASs, also called vivisystems. I have in mind systems that are large in terms of the number of individuals or agents comprising the group.



  1. There is a network of interactions among the large number of individuals in the group, acting in parallel.
  2. Individuals acquire information about the surroundings and about themselves.
  3. Each individual constantly reacts to what the others are doing. Therefore, from the vantage point of any individual, the environment is changing all the time.
  4. Individuals identify regularities in the information acquired by them, and condense those regularities into a schema or conceptual model. CASs are pattern seekers.
  5. The individuals act on the basis of that schema.
  6. There can be many competing schemata, and the most suitable ones survive and evolve, based on the feedback received from interactions with the environment.
  7. The control in a CAS is highly dispersed. No one is really in command.
  8. Coherent behaviour or order in a CAS arises from both competition and cooperation among the individuals themselves.
  9. Emergent behaviour (cf. Part 33) results from competition and cooperation among the individuals.
  10. A CAS has many levels of self-organization. Individuals at one level serve as the building blocks for individuals at the next higher level of hierarchy. In the human brain, for example, one block of neurons forms the functional regions for speech, another for vision, and still another for motor action. These functional areas link up at the next higher level of cognition and generalization.
  11. In the light of new experience (obtained by feedback), CASs may constantly adjust and rearrange their building blocks. This forms the basis of all learning, evolution, or adaptation in CASs. CASs are thus characterized by perpetual novelty. The processes of learning, evolution, and adaptation are basically the same. One of their fundamental mechanisms is the revision and recombination of the building blocks.
  12. CASs are constantly making predictions, thus anticipating the likely future. The predictions are based on various internal models of the world, and the models are constantly revised on the basis of new inputs; they are not static blueprints. Sheer numbers and the mutual exchange of information result in intelligence: swarm intelligence.
  13. CASs have a certain dynamism not present in nonadaptive complex systems. And yet this dynamism is far from being total randomness. CASs have the ability to establish a balance between order and chaos. This balance line is referred to as the EDGE OF CHAOS. This line (or rather a hyper-membrane) in phase space represents the coexistence of order and chaos.
  14. Life signifies both stability and creativity, something that becomes possible in the vicinity of the edge of chaos.
  15. A CAS has many niches, each of which can be exploited by an agent that has adapted itself to fill that niche.
  16. The filling of one niche opens up new niches. The system is far too large ever to be in equilibrium. There is perpetual novelty, the stuff biological evolution is made of.


As pointed out by Murray Gell-Mann in his great book The Quark and the Jaguar, the crux of a highly complex system is in its non-random aspects. He introduced the notion of EFFECTIVE COMPLEXITY, and defined it (relative to a CAS that is observing the system) as roughly the length of a concise description of the regularities of that system or bit string. By contrast, algorithmic information content (AIC) (cf. Part 23) is the length of a concise description of the whole system or string, rather than of the regularities alone.


A CAS separates regularities from randomness. A CAS therefore gives us a way of defining complexity: as the length of the schema used by the CAS for describing and predicting an incoming data stream:

Suppose a bit stream is totally random. Then its AIC is infinite. But its effective complexity is zero, because a CAS observing the stream will find no regularity in it, so the length of the schema describing the regularities is zero. At the other extreme, if the bit stream is totally regular, the AIC is very small (nearly zero), and so is the effective complexity.


For intermediate situations, the effective complexity is substantial.

Thus, for effective complexity to be significant, the AIC must not be too high or too low. That is, the system should be neither too orderly nor too disorderly. For such situations, the AIC is substantial but not maximal, and it has two contributions: The apparently regular portion (corresponding to the effective complexity), and the apparently random or stochastic portion. Complexity thrives when there is a critical balance between order and chaos.
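As a crude illustration of the contrast, one can use a general-purpose compressor as a stand-in for a 'concise description': the compressed length of a string then serves as a rough proxy for its AIC. The Python sketch below (the choice of zlib, the stream length, and the example strings are merely illustrative assumptions) shows that a totally regular stream compresses to almost nothing, while a random stream hardly compresses at all. Finding the effective complexity, by contrast, would require isolating the regularities themselves, which no off-the-shelf compressor does.

import os
import zlib

def aic_proxy(data: bytes) -> int:
    # Compressed length: a crude stand-in for algorithmic information content.
    return len(zlib.compress(data, 9))

n = 10000
regular = b"01" * (n // 2)   # totally regular stream: tiny AIC
random_ = os.urandom(n)      # totally random stream: AIC comparable to its own length

print("regular stream:", aic_proxy(regular), "bytes")
print("random stream: ", aic_proxy(random_), "bytes")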

Saturday 21 July 2012

37. Ant Logic


Ants are the history of social organization and the future of computers (Kevin Kelly 1994).
Ants are social insects, and they occur in huge numbers: there are more than a million ants for every human. An ant is a small, dumb creature, unable to see far. Considering its size, the landscape in which it moves around must appear very rugged to it. How is it, then, that the ants in a colony are able to find food rather rapidly, and generally by the shortest route, even though no one is in command of operations?

It is a case of swarm intelligence. In such a swarm, each individual has little or no intelligence, and it only follows some simple 'local rules', and yet the swarm as a whole ends up possessing intelligence.


It is instructive to understand the basic processes involved in an ant colony, the more so because ANT LOGIC has already found several applications in the field of 'artificial evolution' and in computational science. Dorigo and coworkers did some pioneering work in this area.


Ants in a colony have a distinctive communication mechanism, involving pheromones. Pheromones are chemicals that play the role of signals among members of the same species.

An ant colony is a remarkable parallel-processing superorganism. The ants function independently and simultaneously, and communicate with one another unknowingly via pheromones.

A number of scout ants set out in search of food, going in different directions independently and randomly. They emit pheromone all the time, both while going away from the nest and while returning to it. It follows that a trail used by many ants will have a strong pheromone odour. The pheromone evaporates slowly; i.e., its strength on a trail is a decaying function of time.

Suppose one of the many scout ants has accidentally discovered the shortest usable path to the food. Let us call it trail A (panel 3 in the figure below).


Then it will be able to travel to the food and come back by the same path (guided by the pheromone trail) in the shortest time, compared with other ants which did not happen to take this route. In a given span of time, the to-and-fro journeys along trail A deposit roughly twice as much pheromone per unit length as those along a trail twice as long, simply because each round trip takes half as long. Different ants traverse different trails, and the trails may intersect. At trail-crossings the ants divert to the trail with the strongest odour, thus further strengthening that odour (the LAW OF INCREASING RETURNS, or POSITIVE FEEDBACK). Ultimately all the ants follow the shortest route to the food, even though no design work, planning, or supervision has gone into this. Swarm intelligence indeed. A toy simulation of this feedback is sketched below.
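In the sketch, two trails lead to the same food source, one twice as long as the other, and ants choose between them in proportion to the pheromone odour. The deposit rate, evaporation rate, choice rule, and all numerical values are illustrative assumptions, not measurements:

import random

random.seed(0)
LEN_A, LEN_B = 1.0, 2.0      # trail B is twice as long as trail A
EVAPORATION = 0.05           # fraction of pheromone lost per time step
DEPOSIT = 1.0                # pheromone laid per completed round trip
pher = {"A": 1.0, "B": 1.0}  # start with equal (weak) odour on both trails

for _ in range(200):
    p_A = pher["A"] / (pher["A"] + pher["B"])
    deposits = {"A": 0.0, "B": 0.0}
    for _ant in range(100):
        trail = "A" if random.random() < p_A else "B"
        # Shorter trail -> more round trips per time step -> more pheromone.
        deposits[trail] += DEPOSIT / (2 * (LEN_A if trail == "A" else LEN_B))
    for t in pher:
        # Evaporation: odour is a decaying function of time.
        pher[t] = (1.0 - EVAPORATION) * pher[t] + deposits[t]

share = pher["A"] / (pher["A"] + pher["B"])
print(f"share of pheromone on the short trail after 200 steps: {share:.2f}")

Positive feedback drives the short trail's share towards 1: the more its odour dominates, the more ants reinforce it.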

Investigation of one type of complex system can provide insights into what may be happening in other complex systems. An obvious case in point is: How to understand human intelligence as a kind of swarm intelligence. Human intelligence emerges from the interactions among neurons, in spite of the fact that any particular neuron is as dumb as can be.

Ant logic has been applied, among other things, to the so-called travelling salesman problem (TSP). Suppose a salesman wants to visit every city on his route exactly once and then return home. The TSP is to determine the route for which the distance travelled (or the cost) is the least, and the direct approach involves enumerating all the possible distinct itineraries. Suppose the salesman has to visit, say, five cities and then return home. There are five ways of choosing the first destination city. For each of these, there are four ways of choosing the second city. Having chosen any of these, there are only three ways of picking the next city (because the salesman cannot visit any city twice). And so on. Thus the total number of possible itineraries is 5 × 4 × 3 × 2 × 1 = 120, or 'factorial 5' (denoted by '5!'). For each of these 120 options, one computes the total distance to be covered. The option with the least distance is the best.
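In code, the brute-force enumeration looks like this (the five city names and the distance table below are made up purely for illustration; 'H' stands for the home city):

from itertools import permutations

# Hypothetical symmetric distance table, for illustration only.
dist = {
    ("H", "A"): 4, ("H", "B"): 7, ("H", "C"): 3, ("H", "D"): 6, ("H", "E"): 5,
    ("A", "B"): 2, ("A", "C"): 5, ("A", "D"): 8, ("A", "E"): 4,
    ("B", "C"): 6, ("B", "D"): 3, ("B", "E"): 5,
    ("C", "D"): 4, ("C", "E"): 7,
    ("D", "E"): 2,
}

def d(p, q):
    return dist[(p, q)] if (p, q) in dist else dist[(q, p)]

cities = ["A", "B", "C", "D", "E"]
best = min(
    permutations(cities),                      # all 5! = 120 itineraries
    key=lambda tour: d("H", tour[0])
        + sum(d(a, b) for a, b in zip(tour, tour[1:]))
        + d(tour[-1], "H"),
)
print("best itinerary: H ->", " -> ".join(best), "-> H")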

This is a brute-force way of solving the TSP. But suppose the number of cities to be visited is 50 rather than 5, not an unrealistic number in these days of air travel. The search space now comprises '50!' possibilities, i.e. ~10^64 (1 followed by 64 zeroes). This is a very large number indeed, and a booking agent at an airline would have a very tough time trying to offer a good itinerary.

Algorithms for reducing the size of this search space must be found. At present it is not known whether a so-called 'good algorithm' exists which always gives a minimal solution. In fact, the TSP (in its decision version) is an NP-complete problem.


When ant logic was applied to solving the TSP, surprisingly good results were obtained. Virtual ants ('vants') were created on a computer. Vants were dumb processors in a giant community of processors, operating in parallel. They had a meagre memory, and could communicate only locally, pheromone-like. Each vant would set out rambling from 'city' to 'city', leaving a trail of a time-decaying mathematical function, rather like the pheromone. The shorter the path between cities, the less this mathematical pheromone decayed with time. And the stronger the pheromone signal, the more the other vants followed that route (self-reinforcement of paths). Around 5000 runs enabled the vant group-mind to evolve a near-optimal global itinerary.
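A minimal vant-style sketch for the TSP, in the spirit of Dorigo's ant-colony optimization (the random city layout, the evaporation factor 0.95, the 1/L deposit, and the pheromone/distance choice rule are illustrative assumptions, not the published algorithm):

import math
import random

random.seed(1)
N = 10                                         # number of 'cities'
pts = [(random.random(), random.random()) for _ in range(N)]
d = [[math.dist(pts[i], pts[j]) for j in range(N)] for i in range(N)]
pher = [[1.0] * N for _ in range(N)]           # pheromone on each inter-city edge

def tour_length(tour):
    return sum(d[tour[k]][tour[(k + 1) % N]] for k in range(N))

best_len = float("inf")
for _run in range(500):                        # each run: one vant's ramble
    tour, left = [0], set(range(1, N))
    while left:
        i = tour[-1]
        # Prefer edges with strong pheromone and short length.
        weights = [(j, pher[i][j] / d[i][j]) for j in left]
        r, acc = random.random() * sum(w for _, w in weights), 0.0
        for j, w in weights:
            acc += w
            if acc >= r:
                tour.append(j)
                left.remove(j)
                break
    L = tour_length(tour)
    best_len = min(best_len, L)
    # Evaporate everywhere, then reinforce this tour's edges:
    # shorter tours deposit more 'mathematical pheromone'.
    pher = [[0.95 * p for p in row] for row in pher]
    for k in range(N):
        i, j = tour[k], tour[(k + 1) % N]
        pher[i][j] += 1.0 / L
        pher[j][i] += 1.0 / L

print(f"best tour length found: {best_len:.3f}")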



An ant colony is an example of a 'complex adaptive system' (CAS). More on that next time.

Saturday 14 July 2012

36. Fractals


The idea that Nature is full of 'fractal' configurations was first put forward and investigated by Benoit Mandelbrot in 1977. A fractal structure has scale invariance and self-similarity: it looks the same ('self-similar') at just about any level of magnification or change of scale. This is a consequence of the fact that the same 'local rule' is in operation everywhere for the time or space evolution of the system.


The famous Koch snowflake illustrates the point. The recipe (local rule) for creating it is very simple. A fractal object has an 'initiator' and a 'generator'. In the figure below the initiator is an equilateral triangle, shown in the top left corner.

For obtaining the generator, we partition each side of the triangle into three equal parts, remove the middle one-third, and bridge the gap so created with two segments of the same length, in the form of a peak (i.e., we erect an equilateral triangle on each gap). The resulting form is shown in the top right corner of the figure above.

This process can be repeated indefinitely, applying the generator procedure to each straight segment. The next two iterations are shown in the lower half of the figure.
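The generator rule takes only a few lines of code. The sketch below (using complex numbers for points in the plane; the number of iterations is an arbitrary choice) builds the snowflake's vertex list and prints how the number of segments grows by a factor of 4, and the perimeter by a factor of 4/3, at every step:

import cmath

def koch_step(points):
    # Apply the generator to every segment: remove the middle third and
    # replace it with the two sides of an equilateral 'peak'.
    rot = cmath.exp(-1j * cmath.pi / 3)   # rotation by -60 degrees
    out = []
    for a, b in zip(points, points[1:]):
        v = (b - a) / 3
        out += [a, a + v, a + v + v * rot, a + 2 * v]
    out.append(points[-1])
    return out

# Initiator: an equilateral triangle of unit side, as complex-plane vertices.
points = [0 + 0j, 1 + 0j, 0.5 + 0.5j * 3 ** 0.5, 0 + 0j]

for i in range(4):
    points = koch_step(points)
    n = len(points) - 1                   # number of segments
    print(f"iteration {i + 1}: {n} segments, perimeter = {n / 3 ** (i + 1):.3f}")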

The Koch snowflake looks complex, but has an underlying simplicity, once we have identified the generating mechanism. The self-similarity exhibited by chaotic dynamics (cf. Part 35), as also by other fractal patterns, points to the sameness of the underlying causes at all length scales. No wonder an enormous number of natural entities have fractal shapes. The picture below (of a Romanesco broccoli) is an example.


Each time new triangles are added to the Koch snowflake figure, the length of the line increases. And yet the inner area of the curve remains less than the area of a circle drawn around the original triangle. It is a line of infinite length surrounding a finite area.

The word 'fractal' and the notion of 'fractal dimensions' were introduced to reflect this fact. Each iteration replaces every segment of the Koch snowflake by N = 4 copies scaled down by a factor s = 3, so its fractal dimension is log N / log s = log 4 / log 3 ≈ 1.26. The curve is coarser than a 1-dimensional smooth curve. Since it is more crinkly, it is better at taking up space. However, it is not as good at filling up space as a square (a 2-dimensional object). So it makes sense that the dimension of the Koch curve is a fraction, somewhere between 1 and 2.

Atmospheric phenomena have chaotic character (cf. Part 35). Their complexity is well illustrated by our inability to make long-term predictions about the weather with a high degree of reliability. As stated in Part 35, the chaotic nature of these phenomena was first discovered by Lorenz when he was investigating a coupled system of ordinary differential equations, using a simplified model of 2-dimensional thermal convection, namely the so-called 'Rayleigh-Bénard convection'.

The equations Lorenz formulated for weather phenomena are now called the Lorenz equations. There are three control parameters in the equations, usually denoted by σ, r, and b.

Suppose we take σ = 10.00 and b = 2.67, and let r be the variable control parameter. It is found that there is a critical value of r, namely rc = 24.74, at which there is a sudden change of behaviour. Below this value the system decays to a steady non-oscillating state, i.e. there is a stable 'fixed point' or attractor in phase space. For r > rc, continuous oscillatory behaviour is observed.

r = 28.00 produces aperiodic behaviour or 'deterministic nonperiodic flow'; in other words, chaos. The corresponding attractor is a strange attractor. The phase-space trajectories evolve around two distinct lobes.
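Written out in full, the Lorenz equations are dx/dt = σ(y − x), dy/dt = x(r − z) − y, dz/dt = xy − bz. The rough sketch below (the step size, initial state, and run length are arbitrary choices) integrates them at r = 28 with a fourth-order Runge-Kutta scheme and counts how often the trajectory switches from one lobe to the other; the two lobes sit at positive and negative x:

def lorenz(state, sigma=10.0, r=28.0, b=8.0 / 3.0):
    x, y, z = state
    return (sigma * (y - x), x * (r - z) - y, x * y - b * z)

def rk4(state, dt):
    # One fourth-order Runge-Kutta step.
    k1 = lorenz(state)
    k2 = lorenz([s + 0.5 * dt * k for s, k in zip(state, k1)])
    k3 = lorenz([s + 0.5 * dt * k for s, k in zip(state, k2)])
    k4 = lorenz([s + dt * k for s, k in zip(state, k3)])
    return [s + dt * (a + 2 * b2 + 2 * c + e) / 6
            for s, a, b2, c, e in zip(state, k1, k2, k3, k4)]

state, dt = [1.0, 1.0, 1.0], 0.01
crossings, prev_x = 0, state[0]
for _ in range(20000):
    state = rk4(state, dt)
    if prev_x * state[0] < 0:      # sign change of x: switch of lobe
        crossings += 1
    prev_x = state[0]
print("lobe switches in 200 time units:", crossings)

The aperiodic, irregular sequence of lobe switches is the 'deterministic nonperiodic flow' referred to above.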

The Lorenz strange attractor has an extremely complex structure. A further increase in the value of r causes a series of reverse bifurcations in phase space; i.e., the system moves back from chaotic orbits to periodic orbits. 

Computer-generated fractal art is a truly beautiful offshoot of the science of fractals.



Saturday 7 July 2012

35. Chaotic Systems



The dynamics of a system may be linear or nonlinear. Linear dynamics involves linear mathematical operators, which have the property that their action on a sum of two functions equals the sum of their actions on each function separately.

Nonlinear dynamics means that the output is not linearly proportional to the input. Consider a system described by the equation y(t) = c x(t)^2. If you double the input x, the output y does not double; it becomes four times as large. But if y(t) = c x(t), we are dealing with linear dynamics. Linearity respects the principle of superposition: if x1(t) and x2(t) are two solutions of an equation, then c1 x1(t) + c2 x2(t) is also a solution, where c1 and c2 are arbitrary constants. By contrast, nonlinear operators are such that, e.g., (x1(t) + x2(t))^2 ≠ x1(t)^2 + x2(t)^2.
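A quick numerical check of the superposition test, using the quadratic example just given (the constant c and the inputs are arbitrary choices):

def f_lin(x, c=2.0):
    return c * x          # linear system

def f_quad(x, c=2.0):
    return c * x ** 2     # nonlinear system

a, b = 3.0, 4.0
print(f_lin(a + b) == f_lin(a) + f_lin(b))      # True:  superposition holds
print(f_quad(a + b) == f_quad(a) + f_quad(b))   # False: superposition fails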

In science, chaos is a technical term, signifying highly nonlinear dynamics, although not all nonlinear systems are chaotic. Apart from nonlinearity (or rather, because of strong nonlinearity), a chaotic system is characterized by unpredictable evolution in space and time, in spite of the fact that the differential equations or difference equations describing it are deterministic (if we can neglect 'noise').

Chaotic phenomena are everywhere. According to Kaneko and Tsuda (2000): 'In fact, chaos exists everywhere in our lives. We are tempted to imagine that if chaos is such an ordinary phenomenon, perhaps humans discovered chaos and defined the concept in ancient times. Actually, in many mythical stories chaos is described as a state in which heaven and earth are not divided (a state in which everything is mixed). Chaos is also described as an "energy body" responsible for the creation of heaven and earth'.

In ancient Indian philosophy, the concept of Brahman was propounded for this 'energy body', and the 'state in which everything is mixed' is referred to as Pralaya. Chinese or Indian, these are just interesting old ideas, and have nothing to do with the science of chaos.

In the language of algorithmic information theory (cf. Part 23), chaos has the largest (but not infinite) degree of complexity. By contrast, random or noisy systems have an infinite degree of complexity by this definition.

Chaos theory is about finding the underlying order in apparently random data. It is about a certain class of nonlinear dynamical systems which may be either 'conservative' or 'dissipative'. Conservative systems do not lose energy over time.

A dissipative system, by contrast, loses energy; e.g. via friction. As a consequence of this, it always approaches some limiting or asymptotic configuration, namely an attractor in phase space.

The unpredictability of chaotic systems is due to their extreme sensitivity to initial conditions. This is popularly referred to as the Butterfly effect:
The flapping of a single butterfly's wing today produces a tiny change in the state of the atmosphere. Over a period of time, what the atmosphere actually does diverges from what it would have done. So, in a month's time, a tornado that would have devastated the Indonesian coast doesn't happen. Or maybe one that wasn't going to happen, does (Ian Stewart).
Poincaré, near the end of the 19th century, had already recognized the existence of chaos. The mathematics of celestial motion yielded immensely complicated solutions when the number of interacting bodies was as small as three. But his work on chaos went unnoticed, as did the work of several other scientists in the early 20th century. It was the meteorologist Edward Lorenz who in 1961 established the field of chaos theory as we know it today. Even his work went unnoticed for nearly a decade.

Lorenz was working on the problem of weather prediction. He started with the standard equations of fluid dynamics and simplified them greatly for carrying out his computer-simulation studies of the dynamics of the atmosphere. The remarkable discovery he made, by accident, was that the predictions of his model depended in a crucial way on the precision with which he specified the starting values of the variables. Two calculations, identical except that they differed in one of these starting values by, say, 0.000001, made totally different long-term predictions. The dynamics was not just nonlinear; even the slightest variation in the initial conditions gave wildly different results after a certain number of time steps.
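This is easy to reproduce today. The sketch below integrates two copies of the Lorenz system (the same equations quoted in Part 36) whose starting values differ by one part in a million; the crude Euler integrator and the step size 0.005 are arbitrary choices made for brevity:

def step(x, y, z, dt=0.005, sigma=10.0, r=28.0, b=8.0 / 3.0):
    # One (crude) Euler step of the Lorenz equations.
    return (x + dt * sigma * (y - x),
            y + dt * (x * (r - z) - y),
            z + dt * (x * y - b * z))

p = (1.0, 1.0, 1.0)
q = (1.000001, 1.0, 1.0)    # starting value of x differs by 0.000001
for n in range(1, 10001):
    p, q = step(*p), step(*q)
    if n % 2500 == 0:
        gap = sum((u - v) ** 2 for u, v in zip(p, q)) ** 0.5
        print(f"t = {n * 0.005:5.1f}   separation = {gap:.6f}")

The separation grows roughly exponentially until the two trajectories are as far apart as the attractor itself allows: the butterfly effect in a dozen lines.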

Further investigations led to the discovery of what is now known as the Lorenz attractor. It was found that, although the trajectories were very sensitive to the initial conditions, in every case the output stayed on the same double spiral in phase space.


This was new science. Till then only two types of dynamical behaviour were known: settling to a steady state, and periodic motion. But a chaotic system does not settle to a single-point attractor or a closed-loop attractor in phase space, although its dynamics is not random either. There is order, except that the phase-space trajectory never passes through the same point again. First one spiral is traced, then the other, and then again the first (by a different path); and so on.

The Lorenz attractor shown above belongs to a new family, called strange attractors. Such attractors have the 'self-similarity' feature: They have 'fractal' structure, meaning that the dimension of the structure is a fraction, rather than an integer. The fractional nature of the dimension is why the term 'strange' attractor is used.

I shall discuss fractals in the next post.


Dennis the Menace
Dennis: I want a job where I don't have to be right all the time.
Friend: You want to be a weatherman?