Complexity science

The complexity and sheer volume of real-world data have overwhelmed our ability to compute with it. We need to think again.

How do we model the growth of an organism at cell level? Or predict the interaction of several atoms in a molecule? On a much larger scale, how do we model the effects of climate change? Or predict the growth patterns of social networks on the internet?

Although these questions appear to be very far apart in terms of scale, they share a common attribute: complexity.

Conventional mathematics – as developed over the last two millennia – is good at dealing with ‘simple’ problems, where the types of object or agent are uniform or systems operate in a state of equilibrium. While these mathematical tools have served us well and form the basis of modern engineering, chemistry, biology and economics, their broad generalisations are no longer sufficient. We need to predict precise outcomes in far more complex and dynamic systems.

The difficulty of many problems of interaction rises exponentially as the number of agents increases. While we can formulate algorithms to solve simple cases, in real life the number of agents is often too large for even the most powerful supercomputers available to calculate a single outcome in an acceptable time.
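To see why, consider a toy calculation (our own sketch, not one from the article): even if each agent has just two possible states, the number of joint configurations doubles with every agent added, while the number of pairwise interactions grows quadratically.

```python
# Illustrative sketch: the cost of tracking interacting agents.
# With n two-state agents there are n*(n-1)/2 pairwise interactions,
# but 2**n joint configurations to reason about.

def pairwise_interactions(n):
    """Number of distinct agent pairs."""
    return n * (n - 1) // 2

def joint_states(n):
    """Size of the joint state space for n two-state agents."""
    return 2 ** n

for n in (10, 20, 40):
    print(n, pairwise_interactions(n), joint_states(n))
```

At 40 agents the joint state space already exceeds a trillion configurations, which is why enumerating outcomes directly is hopeless for realistic system sizes.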

“We can model a deterministic system but the smallest perturbation in the initial conditions creates a huge distribution of end results,” says Professor Noah Linden, of Bristol University's mathematics department.
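A standard textbook illustration of this sensitivity (our example; Linden names no specific system) is the logistic map, a fully deterministic rule under which two trajectories starting a billionth apart soon bear no resemblance to each other.

```python
# Sensitivity to initial conditions in the logistic map x -> r*x*(1-x),
# a standard chaotic system (not one named in the article).

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map 'steps' times from x0 and return the result."""
    x = x0
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-9)
print(abs(a - b))  # the billionth-sized perturbation has grown enormously
```

After only 50 steps the initial difference of 10⁻⁹ has been amplified to order one, so the two "predictions" are effectively unrelated.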

Areas such as genetic research and climate modelling have led to an explosion of available data. Added to the number of agents and possible interactions, this data has outstripped the ability of our existing tools to analyse the myriad possible outcomes. We need to develop new mathematics and computational algorithms if we are to make sense of the real complex world.

In response to these challenges, the EPSRC (Engineering and Physical Sciences Research Council) has awarded Bristol University a £4m grant to launch the Bristol Centre for Complexity Sciences, which began its four-year graduate programme in 2007.

Bristol University has appointed two lecturers, Karoline Wiesner and AJ Ganesh, and will appoint two more in 2008. The first cohort of 12 graduate students began the course in October 2007. The one-year course consists of teaching modules in a broad range of subjects, such as nonlinear dynamics, information theory, nanotechnology and protein science, and ends with two three-month research projects, after which the students choose their PhD topics.

“Everything is made of atoms, but when is it appropriate to describe things as atoms (as we do in physics) and when as chemical compounds and reactions (as in chemistry) and when as living organisms (as in biology)?” asks lecturer in complexity sciences, Karoline Wiesner. “Can you find a measure of complexity that tells you where the borders are between these descriptions? Can you quantify the stepwise increase in complexity and approximation? Complexity science is about finding that right level of simplification to describe the system.”

Wiesner's approach to complexity stems from information theory. “One way of describing, say, a molecule, is to look at the information it stores rather than at its atomic energy levels,” says Wiesner. “For example, does the molecule store and/or process information when it transforms from one state to another? Does the state that it is in now contain information about what state it was in before it changed? If so, so must the computation-theoretic representation of the molecule. This is about putting the model on the same informational level as the molecule.”
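Wiesner's question can be made concrete with a toy model (our construction, not hers): treat the "molecule" as a two-state Markov chain and measure, in bits, how much its current state tells you about its next one, using the mutual information from information theory.

```python
import math

# Toy illustration: a "molecule" modelled as a symmetric two-state Markov
# chain. The mutual information between the current state and the next
# quantifies, in bits, how much information the system stores about its past.

def mutual_information(p_stay):
    """I(X_t; X_{t+1}) for a symmetric two-state chain with P(stay) = p_stay.
    The stationary distribution is uniform, so both marginals are (1/2, 1/2)."""
    joint = {(0, 0): 0.5 * p_stay, (0, 1): 0.5 * (1 - p_stay),
             (1, 0): 0.5 * (1 - p_stay), (1, 1): 0.5 * p_stay}
    return sum(p * math.log2(p / 0.25) for p in joint.values() if p > 0)

print(mutual_information(0.5))   # memoryless transitions: 0 bits stored
print(mutual_information(0.9))   # "sticky" states: about 0.53 bits stored
```

When transitions are a coin flip the state carries no memory; when the system tends to stay put, its present state does contain information about its past, exactly the property Wiesner describes.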

AJ Ganesh approaches complexity from network theory. “Large-scale communications networks are ‘small worlds’: any one agent knows only a small number of neighbouring agents but all agents are linked to each other through a short chain of intermediaries. Thus very small changes in one part of the network, such as infection from a virus or worm, can have huge ramifications in the network as a whole,” says Ganesh.
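Ganesh's small-world picture can be sketched in a few lines (a toy construction of ours, not his model): put agents on a ring where each knows only its two neighbours, then add a handful of random shortcut links and watch the typical chain of intermediaries collapse.

```python
import random
from collections import deque

# Small-world sketch: a ring of agents who each know two neighbours,
# plus a few random long-range shortcuts.

def average_distance(n, shortcuts, seed=1):
    """Average shortest chain length from agent 0 to every other agent."""
    rng = random.Random(seed)
    adj = {i: {(i - 1) % n, (i + 1) % n} for i in range(n)}  # ring lattice
    for _ in range(shortcuts):                               # random shortcuts
        a, b = rng.sample(range(n), 2)
        adj[a].add(b)
        adj[b].add(a)
    dist = {0: 0}                    # breadth-first search from agent 0
    queue = deque([0])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return sum(dist.values()) / (n - 1)

print(average_distance(1000, 0))    # pure ring: about 250 steps on average
print(average_distance(1000, 50))   # a few shortcuts: distances collapse
```

On a bare ring of 1,000 agents a message takes around 250 hops on average; just 50 random shortcuts bring that down by an order of magnitude, which is the "short chain of intermediaries" effect.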

“However, the influence of network parameters is not smooth or linear; rather, there appear to be phase transitions, like the change from ice to liquid water. The mathematics of complexity is required to find out what those transition points are.”

“While the topology imposed on a network can affect the interaction of its agents, what if the network evolves without central command? How does the interaction of its agents affect the shape of the network?” asks Ganesh.

The new mathematical models to be developed aim to uncover the patterns and transitions involved and thus answer these questions.
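One such phase transition can already be reproduced in the simplest random-network model (our choice of illustration, the classic Erdős–Rényi graph; the article names no specific model): as the average number of links per agent crosses 1, the largest connected cluster abruptly jumps from a negligible sliver to a giant component spanning most of the network.

```python
import random

# Phase transition in an Erdos-Renyi random graph: the largest connected
# cluster jumps from tiny to giant as the average degree crosses 1.

def largest_cluster_fraction(n, avg_degree, seed=7):
    """Fraction of the n nodes in the largest connected cluster."""
    rng = random.Random(seed)
    parent = list(range(n))          # union-find over the nodes

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    m = int(avg_degree * n / 2)      # edges needed for the target mean degree
    for _ in range(m):
        a, b = rng.randrange(n), rng.randrange(n)
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb          # merge the two clusters
    sizes = {}
    for i in range(n):
        r = find(i)
        sizes[r] = sizes.get(r, 0) + 1
    return max(sizes.values()) / n

for d in (0.5, 1.0, 1.5, 2.0):
    print(d, largest_cluster_fraction(20000, d))
```

Below average degree 1 the largest cluster is a vanishing fraction of the network; above it, a majority of nodes suddenly belong to one cluster, an abrupt, ice-to-water-like change rather than a smooth one.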

Complexity theory and the internet

This image represents connectivity in a portion of the internet. Very small changes in one part of the network, such as infection from a virus or worm, can have huge ramifications in the network as a whole.


Internet connectivity map. Image created by Matt Grint and available under Creative Commons Attribution 2.5 License

The study of complexity

There are very few analytical tools for dealing with complex systems. We need to invent new mathematics.

The study of complexity requires a diverse approach and collaboration across disciplines. Karoline Wiesner, for example, came to it from molecular physics.

The mathematical tools are not limited to communications networks: they will also apply to predicting the spread and decline of epidemics and the interaction of economic agents.
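As a hint of how such tools bear on epidemics, here is a minimal sketch of the classic SIR (susceptible–infected–recovered) model, a standard textbook model with parameters of our own choosing, not one specified in the article. An outbreak grows only while each infection causes more than one new one, peaks, and then declines without infecting everyone.

```python
# Minimal SIR epidemic sketch, integrated with the forward-Euler method.
# beta is the infection rate, gamma the recovery rate; here R0 = beta/gamma = 3.

def sir(beta=0.3, gamma=0.1, i0=0.001, days=300, dt=0.1):
    """Return (peak infected fraction, final recovered fraction)."""
    s, i, r = 1.0 - i0, i0, 0.0
    peak = i
    for _ in range(int(days / dt)):
        new_inf = beta * s * i * dt      # new infections this step
        new_rec = gamma * i * dt         # new recoveries this step
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        peak = max(peak, i)
    return peak, r

peak, final_size = sir()
print(peak, final_size)   # the epidemic peaks, then declines before everyone is infected
```

With these assumed parameters roughly 30% of the population is infected at the peak and about 94% in total, so some susceptibles escape entirely: the epidemic dies out once too few susceptibles remain, not when they run out.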
