The Scope of Complexity
Long after the discovery of atoms and molecules it was still customary in science to think about a collection of many similar objects in terms of some “representative individual” endowed with the sum, or average, of their individual properties. With the exception of particle physics and condensed matter theory, where renormalization group effects were fully recognized, scientists in various disciplines continued their research within the "mean field" framework.
In fact, one may argue that this “mean field” / continuum / linear way of thinking is what preserved the classical sciences as independent sub-cultures. Indeed, the great conceptual jumps separating the various sciences, and the accompanying paradoxes connected to the nature of life, intelligence and culture, arise exactly from the failure of these assumptions. When "More Is Different", life emerges from chemistry, chemistry from physics, consciousness from life, social consciousness / organization from individual consciousness, etc. (The title of the present article associates the beginnings of complexity with the article "More Is Different" published 30 years ago by Phil Anderson.)
This study of the emergence of new collective properties, qualitatively different from the properties of the “elementary” components of the system, breaks the traditional boundaries between sciences: the “elementary” objects belong to one science - say chemistry - while the collective emergent objects belong to another - say biology. As for the methods, they fall “in between”: in the “interdisciplinary space”. The ambitious challenge of Complexity research (its “manifest destiny”) is prospecting, mapping, colonizing and developing this “interdisciplinary” territory. For a visual impression of the fields and subjects involved in the synthesis that complexity tries to achieve, see the attached map.
(for a more detailed interactive version see the Map of Complexity and its Neighbouring Fields)
Theoretical and Phenomenological Origins of Complexity
Many of the crucial ingredients of Complexity appeared in the context of Theoretical Physics. In fact, Anderson listed as his preferred examples phenomena which take place in physical systems: superconductivity, superfluidity, condensation of nucleons in nuclei, neutron stars, glasses.
He emphasized that in spite of the fact that the microscopic interactions in the above phenomena are very different, they can all be explained as realizations of a single dynamical concept: Spontaneous Symmetry Breaking. Therefore, the mere fact that various phenomena fall superficially in different empirical domains should not discourage scientists from studying them within a unified conceptual framework. This birth gift of extreme unifying potential has haunted Complexity research over the intervening 30 years as both its main blessing and its curse.
Discreteness and Autocatalyticity as Complexity Origins
The discrete character of the individuals turned out to be crucial for the macroscopic behaviour of complex systems. In fact, in conditions in which the (partial differential) continuum approach would predict a uniform static world, the slightest microscopic granularity ensures the emergence of macroscopic space-time localized collective objects with adaptive properties which allow their survival and development.
The exact mechanism by which this happens depends crucially on another unifying concept appearing ubiquitously in complex systems: auto-catalyticity. The dynamics of a quantity is said to be auto-catalytic if the time variations of that quantity are proportional (via stochastic factors) to its current value. It turns out that as a rule, the "simple" objects (or groups of simple objects) responsible for the emergence of most of the complex collective objects have auto-catalytic properties.
Autocatalyticity ensures that the behaviour of the entire system is dominated by the elements with the highest auto-catalytic growth rate rather than by the typical or average element. This explains the conceptual gap between sciences: in conditions in which only a few exceptional individuals / events dominate, it is impossible to explain the behaviour of the collective by plausible arguments about the typical or "most probable" individual / event. In fact, in the emergence of nuclei from nucleons, molecules from atoms, DNA from simple molecules, and humans from apes, it is always the un-typical cases (with accidentally exceptional advantageous properties) that carry the day.
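This dominance of the exceptional can be demonstrated in a few lines (a hypothetical toy sketch, not a model from the references): a population of elements each growing autocatalytically, of which a single one has a slightly higher growth rate.

```python
import random

def grow(rates, n_steps=200, seed=4):
    """Autocatalytic growth: each step, x -> (stochastic factor) * rate * x."""
    rng = random.Random(seed)
    x = [1.0] * len(rates)
    for _ in range(n_steps):
        x = [xi * rng.uniform(0.95, 1.05) * r for xi, r in zip(x, rates)]
    return x

# 99 "typical" elements and a single one with a 5% higher growth rate
x = grow([1.00] * 99 + [1.05])
share = x[-1] / sum(x)
print(share)  # the one exceptional element ends up carrying almost all the mass
```

The average element tells us nothing here: after a few hundred steps the single exceptional element owns essentially the whole total, which is exactly why mean-field arguments about the "typical" individual fail.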
Autocatalytic stochastic growth and power laws
One of the early hints of complexity was the observation in 1897 by Pareto that the wealth of individuals spreads over many orders of magnitude (as opposed to the height of a person, which ranges roughly between 1/2 meter and 2 meters). The dynamics of social wealth is thus not dominated by the typical individual but by a small class of very rich people. Mathematically, it was realized that instead of the usual fixed-scale distributions (Gaussian, exponential), wealth follows a "power law" distribution. Moreover, in spite of the wide fluctuations in the average wealth during crises, booms and revolutions, the exponent of the power law has remained within narrow bounds for the last 100 years.
Similar effects were observed in a very wide range of measurements: meteorite sizes, earthquakes, word frequencies and, lately, internet links. In all these systems, the presence of power laws constitutes a conceptual bridge between the microscopic elementary interactions and the macroscopic emergent properties. It turns out that the autocatalytic character of the microscopic interactions governing these systems can explain this behaviour in a generic, unified way: by taking the logarithm of the variables, random changes proportional to the present value become random additive changes. This brings auto-catalytic dynamics within the realm of statistical mechanics, and its powerful methods can be applied efficiently.
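A minimal simulation in the spirit of the generalized Lotka-Volterra mechanism of refs. 7 and 12 (all parameter values here are illustrative): each agent's wealth is multiplied at every step by a random factor, with a lower bound at a fraction of the current mean standing in for redistribution / subsidies. Multiplicative dynamics with such a floor is known to produce a power-law tail.

```python
import random

def simulate_wealth(n_agents=1000, n_steps=1000, floor_frac=0.3, seed=0):
    """Autocatalytic wealth update w -> lambda * w, with a lower bound
    at floor_frac times the current mean wealth."""
    rng = random.Random(seed)
    w = [1.0] * n_agents
    for _ in range(n_steps):
        mean_w = sum(w) / n_agents
        w = [max(rng.uniform(0.9, 1.1) * wi, floor_frac * mean_w) for wi in w]
    return sorted(w, reverse=True)

wealth = simulate_wealth()
# The richest agent ends up orders of magnitude above the median --
# a fat-tailed spread that additive (Gaussian) dynamics never produces.
print(wealth[0] / wealth[len(wealth) // 2])
```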
The Language of Dynamical Networks
The unifying power of the Complexity view is expressed, among other things, in the emergence of a common language which allows quick, effective and robust / durable communication and cooperation between people with very different backgrounds. One of these unifying tools is the concept of the dynamical network. Indeed, one can think about the “elementary” objects (belonging to the “simpler” level) as the nodes of the network and about the “elementary” interactions between them as the links of the network. The dynamics of the system is then represented by (transitive) operations on the individual links and nodes ((dis)appearance, substitutions, etc.).
The global features of the network correspond to the collective properties of the system that it represents: (quasi-)disconnected network components correspond to (almost-)independent emergent objects; scaling properties of the network correspond to power laws; long-lived (meta-stable) network topological features correspond to (super-)critical slowing down dynamics. In this way, the mere knowledge of the relevant emerging features of the network might be enough to devise methods to expedite desired processes by orders of magnitude (or to delay or stop un-wanted ones). The mathematical tools implementing this are presently being developed and include multi-grid and cluster algorithms.
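As a concrete, if toy, illustration of this language (all function names below are ad-hoc, not a standard library): a network can be held as an adjacency dictionary, the elementary dynamics expressed as link / node operations, and the emergent objects read off as connected components.

```python
from collections import deque

def add_link(net, a, b):
    """An elementary interaction appears: link two nodes (both directions)."""
    net.setdefault(a, set()).add(b)
    net.setdefault(b, set()).add(a)

def remove_node(net, a):
    """An elementary object disappears, together with its links."""
    for b in net.pop(a, set()):
        net[b].discard(a)

def components(net):
    """(Quasi-)disconnected components ~ (almost-)independent emergent objects."""
    seen, comps = set(), []
    for start in net:
        if start in seen:
            continue
        comp, queue = set(), deque([start])
        while queue:
            node = queue.popleft()
            if node not in comp:
                comp.add(node)
                queue.extend(net[node] - comp)
        seen |= comp
        comps.append(comp)
    return comps

net = {}
for a, b in [("A", "B"), ("B", "C"), ("D", "E")]:
    add_link(net, a, b)
print(len(components(net)))  # 2 emergent clusters: {A,B,C} and {D,E}
remove_node(net, "B")        # one microscopic operation...
print(len(components(net)))  # ...changes the macroscopic structure: 3 components
```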
Multigrid and Clusters
The mathematical counterpart to the physicist's Renormalization Group is the Multigrid tradition. In the last decade the two have interacted profitably, each complementing the other's strengths and weaknesses. A direction with particular conceptual significance is Algebraic Multigrid.
The basic step of Algebraic Multigrid is the transformation of a given network into a slightly coarser one by freezing together a pair of strongly connected nodes into a single representative node. By repeating this operation iteratively, Algebraic Multigrid ends up with nodes which stand for large collections of strongly connected microscopic objects. The algorithmic advantage is that the rigid motions of the collective objects are represented on the coarse network by the motion of just one object. One can separate in this way the various time scales. For instance, the time to separate two stones connected by a weak thread is much shorter than the time it takes for each of the stones to decay to dust. If these two processes were represented by the same network, one would have to represent time spans of the order of millions of years (typical for stone decay) with a time step of at most 1 second (the typical time for the thread to break). The total number of time steps would become unbearably large. The Multigrid procedure allows the representation of each sub-process at the appropriate scale: at each scale, the collective objects which can be considered "simple" elementary objects at that scale are represented by just one node. This is a crucial step whose importance transcends the mere speeding up of the computations. By labeling the relevant collective objects at each scale, the algorithm becomes an expression of the understanding of the emergent dynamics of the system rather than a mere tool towards acquiring that understanding. Multigrid (and their cousin, Cluster) algorithms have the potential to organize automatically the vast amounts of correlated information existing in complex systems such as the internet, fMRI data, etc.
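The freezing step can be sketched in a few lines (a drastically simplified caricature of real Algebraic Multigrid, which uses more refined coarsening criteria; all names are illustrative). Strongly coupled pairs are merged first, so the weakly attached "thread" node survives to the coarse level.

```python
def coarsen(net, target_nodes):
    """Iteratively freeze the most strongly coupled pair of nodes into one
    representative node, summing the weights of merged links.
    `net` maps node -> {neighbour: coupling strength}."""
    net = {a: dict(nbrs) for a, nbrs in net.items()}
    while len(net) > target_nodes:
        a, b = max(((a, b) for a in net for b in net[a]),
                   key=lambda ab: net[ab[0]][ab[1]])   # strongest link
        merged = f"{a}+{b}"
        nbrs = {}
        for old in (a, b):                              # collect outside links
            for c, w in net[old].items():
                if c not in (a, b):
                    nbrs[c] = nbrs.get(c, 0) + w
        for old in (a, b):                              # delete the frozen pair
            for c in net.pop(old):
                if c in net:
                    net[c].pop(old, None)
        net[merged] = nbrs                              # insert the coarse node
        for c, w in nbrs.items():
            net[c][merged] = w
    return net

# two tightly bound "stones" (s1-s2 and u1-u2) joined by a weak "thread" t
net = {"s1": {"s2": 9.0, "t": 0.1}, "s2": {"s1": 9.0},
       "u1": {"u2": 8.0}, "u2": {"u1": 8.0, "t": 0.1},
       "t": {"s1": 0.1, "u2": 0.1}}
coarse = coarsen(net, 3)
print(sorted(coarse))  # ['s1+s2', 't', 'u1+u2']
```

At the coarse level the fast process (the weak thread, node "t") and the slow internal dynamics of each "stone" live on different nodes, so each can be evolved with its own time step.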
Much of the present Complexity work may be thought of as an application (with appropriate adjustments) of the table proposed 30 years ago by Anderson, where the "simpler" science appears in the second column and the "more complex" one in the first:
Atomic physics - elementary particles
Chemistry - Atomic physics
Molecular Biology - Chemistry
Cell Biology - Molecular Biology
Psychology - Physiology
Social Sciences - Psychology
Below is an incomplete list of particular complexity directions substantiating this table. Of course, when looking for complexity one should keep in mind that "when you carry a hammer, a lot of things look like nails". Some things might still be simple.
Society - The emergence of traffic jams from single cars
Traffic simulation is an ideal laboratory for the study of complexity: the network of streets is highly documented and the cars' motion can be measured and recorded with great precision. Yet the formation of jams is a very non-trivial consequence of the individual car events. Simpler, but no less important, projects might be the motion of masses of humans in structured places, especially under pressure (in stadiums as a match ends, or in theaters during alarms). The social importance of such studies is measured in many human lives (see http://www.helbing.org and references therein for further information).
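A standard minimal model of this emergence is the Nagel-Schreckenberg cellular automaton (not discussed in the text above, but widely used in the traffic literature); the sketch below, with illustrative parameter values, shows jams arising from three local rules applied by each car.

```python
import random

def nagel_schreckenberg(length=100, n_cars=35, v_max=5, p_slow=0.3,
                        n_steps=100, seed=1):
    """Single-lane circular road; each step every car (i) accelerates,
    (ii) brakes to avoid the car ahead, (iii) randomly slows down."""
    rng = random.Random(seed)
    pos = sorted(rng.sample(range(length), n_cars))
    vel = [0] * n_cars
    for _ in range(n_steps):
        for i in range(n_cars):
            gap = (pos[(i + 1) % n_cars] - pos[i] - 1) % length
            vel[i] = min(vel[i] + 1, v_max, gap)        # accelerate, never crash
            if vel[i] > 0 and rng.random() < p_slow:     # random slowdown
                vel[i] -= 1
        pos = [(p + v) % length for p, v in zip(pos, vel)]
        order = sorted(range(n_cars), key=lambda i: pos[i])
        pos, vel = [pos[i] for i in order], [vel[i] for i in order]
    return pos, vel

pos, vel = nagel_schreckenberg()
# At density 0.35, well above the jamming transition (~1/(v_max+1)),
# the mean velocity stays far below the free-flow value v_max.
print(sum(vel) / len(vel))
```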
From customers to markets
The traditional approach in the product diffusion literature is based on differential equations and leads to a continuous sales curve. This contrasts with the results obtained by a discrete model that represents explicitly each customer and selling transaction. Such a model leads to a sharp (percolation) phase transition that explains the polarization of campaigns into hits and flops for apparently very similar products, and the fractal fluctuations of sales even in steady market conditions.
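A minimal sketch of such a discrete percolation model (hypothetical parameters; the actual model of ref. 10 differs in detail): each consumer on a grid adopts the product if a neighbour has adopted it and the product quality q exceeds the consumer's private threshold. The reach of the campaign jumps sharply near the site-percolation threshold of the square lattice (about 0.593), separating "flops" from "hits".

```python
import random
from collections import deque

def campaign_reach(q, size=50, seed=2):
    """Fraction of consumers reached: 'adoption' spreads by BFS from a row
    of early adopters through neighbours whose private threshold is < q."""
    rng = random.Random(seed)
    adopts = [[rng.random() < q for _ in range(size)] for _ in range(size)]
    reached = {(0, y) for y in range(size) if adopts[0][y]}  # early adopters
    queue = deque(reached)
    while queue:
        x, y = queue.popleft()
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nx < size and 0 <= ny < size
                    and (nx, ny) not in reached and adopts[nx][ny]):
                reached.add((nx, ny))
                queue.append((nx, ny))
    return len(reached) / size ** 2

for q in (0.3, 0.55, 0.65, 0.85):
    print(q, round(campaign_reach(q), 3))  # reach jumps sharply near q ~ 0.59
```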
The emergence of financial markets from investors
Financial economics has a long history of using precise mathematical models to describe market behaviour. However, in order to be tractable, the classical market models (the Capital Asset Pricing Model, the Arbitrage Pricing Theory, the Black-Scholes option valuation formula) made assumptions which behavioural finance and market behaviour experiments have found to be invalid. By using a direct computer representation of the individual investors' behaviour, one can study the emergence of the (non-equilibrium) market dynamics under completely realistic conditions. The simulations performed until now have already suggested generic universal relationships which were abstracted and then taken up for theoretical study in the framework of stylized models.
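The flavour of such microscopic market simulations can be conveyed by a deliberately stripped-down sketch (this is not the model of refs. 11-12; every name and parameter below is an illustrative assumption): each investor compares a noisy private valuation with the current price, and the price responds multiplicatively to the excess demand.

```python
import math
import random

def market_sim(n_agents=200, n_steps=500, seed=3):
    """Toy microscopic market: investors submit buy/sell orders based on
    noisy private valuations; price moves with the excess demand."""
    rng = random.Random(seed)
    price, fundamental = 1.0, 1.0
    prices = []
    for _ in range(n_steps):
        demand = 0
        for _ in range(n_agents):
            valuation = fundamental * math.exp(rng.gauss(0, 0.1))
            demand += 1 if valuation > price else -1     # buy or sell one unit
        price *= math.exp(0.001 * demand)                # multiplicative impact
        prices.append(price)
    return prices

prices = market_sim()
print(min(prices), max(prices))  # the price fluctuates around the fundamental
```

Even this caricature shows the key structural point: the price process is not postulated (as in the classical models) but emerges from the aggregation of individual decisions, so the behavioural assumptions can be varied at will.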
Interactive Markets Forecast and Regulation
After losing a fortune in a bubble (triggered by the South Sea Co.) in 1720 on the London Stock Exchange, Sir Isaac Newton was quoted as saying: "I can calculate the motions of the heavenly bodies, but not the madness of people." It might seem over-ambitious to try where Newton has failed, but let us not forget that we are 300 years later, have big computers, and have had plenty of additional opportunities to contemplate the madness of people.
Understanding and regulating the dynamics of the (financial) markets is in some ways similar to predicting and monitoring weather or road traffic, and at least as important: one cannot predict individual car accidents, but one can predict, based on the present data, the probable behaviour of the system as a whole. Such prediction ability allows the optimization of system design as well as on-line intervention to avert unwanted disturbances. Moreover, one can estimate the effect of unpredictable events and prepare the reaction to them.
It is certainly a matter of top priority that the public and the authorities in charge of economic stability have at their disposal standard, reliable tools for monitoring, analysis and intervention.
Horizontal Interaction Protocols and Self-Organized Societies
The old world was divided into distinct organizations: some small (a bakery, a shoe store) and some large (a state administration, an army). The way to keep it working was for the big ones to have a very strict hierarchical chain of command, and for the small ones (which couldn't support a hierarchy) to keep everybody in close "horizontal" personal contact. With the emergence of the third sector (public non-profit organizations), of fast-developing specialized activities, and of the very lively ad-hoc merging and splitting of organizations, the need for lateral (non-hierarchical) communication in large organizations has increased. Yet, as opposed to the hierarchical organization, nobody knows how to make and keep under control a non-hierarchical organization. The hope is that some protocols acting at the "local" level may lead to the emergence of a global "self-organizing" order. The study and simulation of such systems might lead to the identification of modern "Hammurabi codes of laws" with which to regulate (and defend) the new "distributed" society.
Biology - The emergence of the Immune Self from immune cells
The immune system is a cognitive system: its task is to gather antigenic information, make sense of it and act accordingly. The challenge is to understand how the system integrates the chemical signals and interactions into cognitive modules and phenomena. Lately, a few groups have adopted the method of representing in the computer the cells and enzymes believed to be involved in an immune disease, implementing in the computer their experimentally known interactions and reactions, and watching the emergence of (auto-)immune features similar to the ones observed in nature. The next step is to suggest experiments to validate / amend the postulated mechanisms.
Identifying and Manipulating the "Atoms" of Life
The situation in molecular biology, genetics and proteomics today resembles the situation of Zoology before Darwin and of Chemistry before the periodic table: "everything" is known (at least all the human genes), some regularity rules are recognized, but the field lacks a unifying dynamical principle. In particular, the dynamics of "folding" (the process that gives proteins their shape given a certain base sequence) and the relation between each protein's shape and its function are anybody's guess.
In principle it is arguable that these problems can be solved within the borders of the present techniques and concepts (with some addition of data mining and informatics). However, I would rather bet on the emergence of new concepts, in terms of which this "total mess" would become "as simple" as predicting the chemical properties of elements in terms of the occupancy of their electronic orbitals. So the problem is: what are the "true" relevant degrees of freedom in protein / gene dynamics? Single bases / nucleic acids are "too small"; alpha chains or beta sheets - too big. See the new ComPlexUs journal www.karger.ch/journals/cpu/cpu_jh.htm for relevant interdisciplinary efforts to solve this problem. Of course, answering it will transform the design of new medicines into a systematic search rather than the random walk that it is today.
Cognition - The emergence of Perceptual Systems
(the example of the visual system)
The micro-to-macro paradigm can be applied to a wide range of perceptual and functional systems in the body. The main steps are to find the discrete microscopic degrees of freedom and their elementary interactions, and to deduce the emergent macroscopic degrees of freedom and their effective dynamics. In the case of the visual system this generic program is quite advanced. By using a combination of mathematical theorems and psychophysical observations, one has identified the approximate, ad-hoc algorithms that the visual system uses to reconstruct 3D shapes from 2D image sequences. As a consequence, specific visual illusions were predicted and dramatically confirmed by experiment. This kind of work can be extended to other perceptual systems and taken in a few directions: guidance for medical procedures, inspiration for novel technology, etc.
Microscopic Draws and Macroscopic Drawings
The processes of drawing and handwriting (and most thought processes) look superficially continuous and very difficult to characterize in precise terms. Yet lately it has been possible to isolate very distinct discrete spatio-temporal drawing elements and to put them in direct relation to discrete mental events underlying the emergence of meaningful representation in children. The clinical implications, e.g. for (difficulties in) the emergence of writing, are presently being studied. The realization that there are intermediate ("higher than neuron" scale) "atoms" in cognitive processes is very encouraging for the possibility of applying complexity methods in this field.
Conceptual Structures with Transitive Dynamics
Dynamical networks were mentioned as a candidate for a "lingua franca" among complexity workers. The nodes are fit to represent system parts / properties, while the links can be used to represent their relationships. The evolution of objects, production processes and ideas can then be represented as operations on these networks. By a sequence of formal operations on the initial network one is led to a novel network. The changes enforced in the network structure amount to changes in the nature of the real object. The sequence of operations leading to novel objects is usually quite simple, mechanical, well defined and easy to reproduce.
It turns out that a handful of universal sequences (which have been fully documented) are responsible for most of the novelty emergence in nature. Incidentally, ideas produced by a computer applying one of these sequences obtained higher inventiveness marks (from human judges in a double-blind setting) than the ideas produced by (a second group of) humans.
The aim of Complexity is to express, explain and control the collective objects and phenomena emerging at a certain space-time scale from the simpler interactions of their components at a finer scale. This is a sort of extension of the stochastic "atomic-molecular" methods to social, biological and cognitive problems.
The interdisciplinary integration that this implies is not just a juxtaposition of various expertises but a much more intimate fusion of knowledge. It involves a coordinated shift in the very objectives, scope and ethos of the affected disciplines. Complexity is not offering just a way of answering a question from one science using concepts from another: it is introducing a new language which allows the formulation of novel questions, or rather a new grammar which allows novel interrogative forms.
Bringing people from these disciplines together is not enough. These fields have very different "cultures": different objectives, criteria of success, techniques and language. A deep shift in their way of thinking is necessary.
Realizing it requires "growing" a new generation of "bilingual" young scientists who will produce the necessary synthesis in their own minds.
Complexity induces a new relation between theoretical and applied science. In the past, as technology was acting on hardware objects, applied science was mainly experimental science applied to real life situations. Today, when technology is acting on information, applied science consists often of theoretical / abstract operations applied to real life information items. One may have to get used to the expression "Theoretical Applied Science".
Source: This article is based on a plenary talk by Prof. Sorin Solomon at the 12th European Physical Society General Meeting. The research was supported in part by the Israeli Science Foundation. We are grateful to Prof. Sorin Solomon and to the EPS for granting us permission to reproduce this article in PhysicaPlus Online.
1. P.W. Anderson, More is Different, Science 177 (1972) 393-396.
2. S. Solomon, The Microscopic Representation of Complex Macroscopic Phenomena; Critical Slowing Down - A Blessing in Disguise, in Annual Reviews of Computational Physics II, pp. 243-294, ed. D. Stauffer, World Scientific, 1995.
3. P.W. Anderson, Physics: The opening to complexity, Proc. Natl. Acad. Sci. USA 92 (1995) 6653-6654.
4. N. Shnerb et al., The importance of being discrete: Life always wins on the surface, Proc. Natl. Acad. Sci. USA 97 (19) (2000) 10322-10324.
5. Y. Louzoun et al., Modeling complexity in biology, Physica A 297 (1-2) (2001) 242-252.
6. R.N. Mantegna and H.E. Stanley, An Introduction to Econophysics: Correlations and Complexity in Finance, Cambridge University Press, 1999.
7. O. Malcai et al., Theoretical analysis and simulations of the generalized Lotka-Volterra model, Phys. Rev. E 66 (2002) 031102.
8. S. Bornholdt and H.G. Schuster (eds.), Handbook of Graphs and Networks: From the Genome to the Internet, Wiley-VCH, Weinheim, 2003.
9. I. Stenhill et al., Dynamical Algebraic multi-grid simulations of free fields on RTS, Computer Physics Communications 83 (1994) 23-29.
10. J. Goldenberg et al., Marketing percolation, Physica A 284 (1-4) (2000).
11. M. Levy et al., Microscopic Simulation of Financial Markets: From Investor Behaviour to Market Phenomena, Academic Press, New York, 2000.
12. O. Biham et al., Generic emergence of power law distributions and Lévy-stable intermittent fluctuations in discrete logistic systems, Phys. Rev. E 58 (1998) 1352.
13. U. Hershberg et al., HIV time hierarchy: winning the war while losing all the battles, Physica A 289 (1-2) (2001) 178-190.
14. N. Rubin et al., Restricted Ability to Recover 3D Global Motion from 1D Motion Signals: Theoretical Observations, Vision Research 35.
15. E. Adi-Japha et al., Emergence of Representation in Drawing: The Relation Between Kinematic and Referential Aspects, Cognitive Development 13 (1998) 25-51.
16. J. Goldenberg et al., Creative Sparks, Science 285 (1999) 1495-1496.