Here we provide a brief overview of the main research areas in the School of Mathematics. Note that many members of staff have interests spanning several different areas, reflecting the many connections between the areas listed below.
Algebra | Analysis | Applied Probability | Bayesian Modelling and Analysis | Combinatorics | Complexity | Ergodic Theory, Dynamical Systems and Statistical Mechanics | Fluid Dynamics | Logic and Set Theory | Material Science, PDEs, Variational Problems and Applications | Monte Carlo Computation | Multiscale Methods | Nonparametric Regression | Number Theory | Numerical Methods | Optimisation under Uncertainty | Quantum Chaos | Quantum Computation and Quantum Information Theory | Random Matrix Theory | Scaling Limits | Statistical Bioinformatics | Statistical Physics | Statistical Signal Processing | Time Series
Algebra
The Algebra group focusses on group theory and representation theory. Groups are algebraic structures that arise naturally throughout mathematics. They encode the symmetries in a vast range of mathematical and physical systems, and group theory provides a powerful and unified language for studying these symmetries. Current areas of research in Bristol include finite and algebraic groups, simple groups and geometric group theory. Representation theory, in its broadest sense, is the art of relating the symmetries of different objects. It is a vast subject enjoying a close relationship with topology, geometry, number theory, combinatorics and mathematical physics.
Analysis
A main strand of research in the Analysis group is the interplay between the spectrum of the Laplace operator with self-adjoint boundary conditions and the geometry of the manifold or domain in Euclidean space. Various tools such as the theory of function spaces, calculus of variations and shape optimization techniques play a key role. A closely related area is the stochastic analysis of Brownian motion, and quantitative properties of solutions of the heat equation both on Riemannian manifolds and in domains in Euclidean space. Other research themes in the group include the theory of function spaces and geometric function theory. Geometric function theory is concerned with how infinitesimal properties of functions, such as conformality, have large-scale geometric consequences. All these topics have links to many other groups at Bristol, such as Ergodic Theory and Dynamical Systems, Mathematical Physics, and Group Theory.
Applied Probability
Many real-world systems, such as queueing systems, communication networks and financial markets, evolve in a random fashion over time. Research in applied probability uses results from the theory of probability to model such processes mathematically, providing insight so that they can be better understood. Advances in techniques such as stochastic approximation, non-Markovian random walks and stochastic control have led to developments in other topics, including Monte Carlo Computation and Optimisation under Uncertainty. See also the Centre for Doctoral Training in Communications, which is one of our Interdisciplinary PhD programmes.
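As a toy illustration of the kind of stochastic model studied in this area, the sketch below simulates an M/M/1 queue (Poisson arrivals, exponential service) and compares the time-averaged number of customers in the system with the classical formula ρ/(1−ρ). All parameters are invented for the example.

```python
import random

def mm1_mean_in_system(lam=0.5, mu=1.0, n_events=200_000, seed=1):
    """Estimate the time-averaged number of customers in an M/M/1 queue
    (Poisson arrivals at rate lam, exponential service at rate mu)."""
    random.seed(seed)
    t, n, area = 0.0, 0, 0.0
    for _ in range(n_events):
        rate = lam + (mu if n > 0 else 0.0)   # total rate of the next event
        dt = random.expovariate(rate)
        area += n * dt                        # accumulate time-weighted queue length
        t += dt
        if random.random() < lam / rate:
            n += 1                            # arrival
        else:
            n -= 1                            # departure
    return area / t

rho = 0.5 / 1.0                               # traffic intensity lam/mu
print(mm1_mean_in_system())                   # simulation estimate
print(rho / (1 - rho))                        # classical formula: rho/(1-rho) = 1
```

The simulated time average should be close to the theoretical value, illustrating how probabilistic analysis and simulation complement each other for such models.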
Bayesian Modelling and Analysis
Research into Bayesian methods includes work both on generic issues, such as model selection, graphical models and default priors, and on application-specific models and methods in a wide variety of domains, including genetic epidemiology, econometrics, hydrology and traffic management. The key characteristic of Bayesian methods is that all variables (data, parameters, latent variables, etc.) are treated as random, and all uncertainties are expressed as probabilities. This gives Bayesian analysis an attractive uniformity and coherence. Inferential tasks such as hypothesis testing and the construction of confidence intervals, which have to be performed indirectly in classical inference, can be replaced by direct probability statements about unknowns. Bayesian analysis is inherently suited to complexity, prediction, sequential updating of information and nuisance parameters. Developments in Monte Carlo Computation have dramatically eased the computational challenge of implementing Bayesian methods, particularly in complex models.
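A direct probability statement of the kind described above falls straight out of a conjugate Bayesian analysis. The sketch below (the prior and data are invented for illustration) updates a Beta prior on a coin's heads probability and computes the posterior probability that it exceeds one half:

```python
import math

# Beta(a0, b0) prior on a coin's heads-probability theta; observe k heads in n tosses.
# Prior and data are hypothetical, chosen only for the example.
a0, b0 = 1.0, 1.0                  # uniform prior
k, n = 7, 10                       # hypothetical data
a, b = a0 + k, b0 + (n - k)        # conjugate update: posterior is Beta(a, b)

def post_pdf(x):
    logc = math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
    return math.exp(logc + (a - 1) * math.log(x) + (b - 1) * math.log(1 - x))

# Direct probability statement about the unknown: P(theta > 1/2 | data),
# by midpoint-rule integration of the posterior density over (1/2, 1).
m = 100_000
width = 0.5 / m
p = sum(post_pdf(0.5 + (i + 0.5) * width) for i in range(m)) * width
print(round(p, 3))   # ≈ 0.887 for this prior and data
```

No indirect test procedure is needed: the quantity of interest is simply a probability computed from the posterior.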
Combinatorics
While combinatorial structures are investigated in their own right with great success, problems of a combinatorial nature arise in many other areas of pure mathematics, notably in group theory, probability, topology, geometry and number theory. Bristol has particular strengths in advancing interactions of combinatorics with the latter two areas, specifically in the study of incidence geometry/geometric combinatorics and of Szemerédi-type problems concerning arithmetic structures in subsets of the integers and the primes.
Complexity
Complexity is a multidisciplinary subject linking mathematics, statistics and computer science with application areas in engineering, life and molecular sciences. The School of Mathematics has ongoing collaborations with staff in the School of Biological Sciences, Engineering Mathematics and Computer Science. There are also strong links with the Bristol Centre for Complexity Sciences.
Ergodic Theory, Dynamical Systems and Statistical Mechanics
Ergodic theory is a branch of pure mathematics that investigates the statistical properties of dynamical systems. The time evolution of even very simple systems can be completely unpredictable, and one of the key objectives of ergodic theory is to identify and classify measures that are invariant under the time evolution, thus allowing deep insights into the structure of the dynamics. Simple examples of chaotic dynamical systems include geodesic flows on negatively curved surfaces, and billiard tables with convex scatterers (Sinai billiards). Ergodic theory has provided powerful tools to solve some outstanding problems in other research fields, e.g., in number theory, combinatorics, quantum chaos and statistical physics. Indeed, most physical problems can be viewed as a dynamical system. Typically this involves studying the solution structure of nonlinear equations, understanding how these solutions may vary as the dynamical system changes, and discerning generic properties of the solutions, for example, whether they exhibit chaotic behaviour. There are connections to the Quantum Chaos research area.
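A minimal numerical illustration of the time-average/space-average idea at the heart of ergodic theory (the rotation number, starting point and interval below are arbitrary illustrative choices): an irrational rotation of the circle is uniquely ergodic, so the fraction of time an orbit spends in an interval converges to the interval's length.

```python
import math

# Irrational rotation of the circle, x -> x + alpha (mod 1), is uniquely ergodic:
# Birkhoff (time) averages converge to the Lebesgue (space) average for every orbit.
alpha = (math.sqrt(5) - 1) / 2     # golden-mean rotation number
x, n, hits = 0.1, 100_000, 0
for _ in range(n):
    x = (x + alpha) % 1.0
    if x < 0.3:                    # observable: indicator of the interval [0, 0.3)
        hits += 1
print(hits / n)                    # ≈ 0.3, the Lebesgue measure of the interval
```

Chaotic systems such as Sinai billiards exhibit the same convergence of time averages for almost every starting point, though their orbits are far harder to compute reliably.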
Fluid Dynamics
Fluid Dynamics describes a tremendous variety of phenomena, from the large scale (e.g. weather and ocean systems on Earth and other planets or stars) through medium scales (e.g. the flow around Grand Prix racing cars) to the very small scale (e.g. microdroplets for drug delivery). Our main aim is to understand how the nonlinear character of the hydrodynamic equations leads to the wealth of flow properties observed. These range from the formation of novel flow structures, in particular those covering many length scales, to fully turbulent flows. We are also pushing beyond the boundaries of classical hydrodynamics by studying polymeric fluids, granular media and flows on the nanoscale. This places our group at a junction between mathematics, physics, chemistry, engineering and geophysics. Our in-house fluid dynamics laboratory keeps us closely connected to the ‘real world’, and presents us with ever-new theoretical challenges. Traditional research into fluid dynamics at Bristol involves free surface flows, especially water waves (Dr Richard Porter), and turbulence and transition to turbulence (Prof Rich Kerswell). More recently, it has expanded to include granular media and particle-laden flows and complex fluids (Dr Andrew Hogg; Prof Rich Kerswell, FRS; Prof Jens Eggers) and vertical flows (Prof Rich Kerswell).
Logic and Set Theory
Set Theory can be regarded as a foundation for all mathematics. Nevertheless, many mathematical problems of foundational interest remain unresolved: the Continuum Hypothesis is only one example. Current set theory provides direct applications in many areas of mathematics, particularly of an analytical kind: Banach space theory, C*-algebras, as well as Lebesgue measure theory (which sets are Lebesgue measurable?). Current interests of the group in Bristol include: (i) the interaction between models of set theory, determinacy of perfect-information games and analysis; (ii) the interaction between set theory and the category theory of algebraic topology; (iii) on the mathematical-logic side, the proof theory of set theory’s axioms when augmented by truth predicates.
Material Science, PDEs, Variational Problems and Applications
Recent years have seen intense interaction between mathematics and materials science, including solid mechanics and liquid crystals. This has borne much fruit: in solid mechanics, the explanation of intriguing material behaviour (e.g., the shape-memory effect) by mathematical models that relate behaviour to microstructure; in liquid crystals, models relating static and dynamic properties to the existence and regularity of harmonic maps, possibly with defects, between topologically nontrivial spaces. This successful interaction has in turn raised a number of questions, many of which are of interest simultaneously in mathematics, in the physical sciences and in engineering. The relevant mathematical areas are primarily, but not exclusively, calculus of variations, partial differential equations, functional and real analysis, and topology. Specific research problems include the following. Solid mechanics: microstructure evolution in solids, homogenization of (polycrystalline) materials with degenerate energy, computation of quasiconvex hulls and morphology formation in biological tissues as a result of stresses induced by growth. Liquid crystals: topological classification and energy bounds for nematics in polyhedral geometries with natural (e.g. tangent) boundary conditions; the number of smooth solutions and regularity of weak solutions; switching mechanisms and applications to bistable display technology.
Monte Carlo Computation
Monte Carlo methods are simulation algorithms designed to compute answers to deterministic questions using random numbers. Although used in many branches of science, in statistics they are principally used to compute probabilities and expectations in complex stochastic models. Markov chain Monte Carlo (MCMC) techniques are dynamic simulation methods where variables are updated iteratively in a stationary way, and whose use in computation depends on convergence theorems for Markov chains. These methods have proved remarkably important in implementing Bayesian methods, especially in complex models. Research in the group is focussed on several key areas of Monte Carlo methodology, including adaptive MCMC, particle filters, trans-dimensional MCMC and simulated annealing. It addresses both methodological issues (construction of algorithms) and theoretical aspects (proof of convergence, quantifying performance). Monte Carlo methods were imported to the discipline of statistics from physics. In modern terms they originate from Los Alamos and the atomic bomb project, although there is reference to them as far back as the ancient Babylonians of Biblical times. Now they are applied across many scientific fields, including engineering, aerospace, image and speech recognition and robot navigation.
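A minimal random-walk Metropolis sketch of the MCMC idea described above. The target density, step size and chain length are illustrative choices, not a prescription from the group's research; the chain's stationary distribution is the target, so long-run averages estimate its moments.

```python
import math, random

def metropolis(logpi, x0, step, n, seed=0):
    """Random-walk Metropolis: builds a Markov chain whose stationary
    distribution has (unnormalised) log-density logpi."""
    random.seed(seed)
    x, chain = x0, []
    for _ in range(n):
        y = x + random.gauss(0.0, step)          # symmetric Gaussian proposal
        delta = logpi(y) - logpi(x)
        if delta >= 0 or random.random() < math.exp(delta):
            x = y                                # accept; otherwise keep current state
        chain.append(x)
    return chain

# Target: standard normal. The chain's long-run mean and variance should be near 0 and 1.
chain = metropolis(lambda x: -0.5 * x * x, x0=3.0, step=1.0, n=100_000)
mean = sum(chain) / len(chain)
var = sum((c - mean) ** 2 for c in chain) / len(chain)
print(round(mean, 2), round(var, 2))
```

Convergence of such averages rests exactly on the Markov chain convergence theorems mentioned above; adaptive and trans-dimensional variants refine the same basic construction.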
Multiscale Methods
In recent years multiscale methods have revolutionised the modelling and analysis of phenomena in a number of different disciplines. The ‘multiscale’ paradigm typically involves a multiscale representation and then manipulation of that representation to achieve a desired goal. Practical applications include modelling communications network traffic - such as queues on routers - and image compression. The JPEG 2000 image standard is based on wavelet compression, as is the FBI fingerprint database.
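The wavelet compression mentioned above rests on a multiscale transform: represent the signal as coarse averages plus details at successive scales, then manipulate (e.g. threshold) the representation. Here is a small self-contained sketch of the Haar transform and its inverse; the signal is made up, and real coders such as JPEG 2000 use smoother wavelets plus quantisation and entropy coding on top of this idea.

```python
def haar(data):
    """Full Haar wavelet decomposition of a list whose length is a power of two."""
    coeffs = list(data)
    n = len(coeffs)
    while n > 1:
        tmp = coeffs[:n]
        for i in range(n // 2):
            coeffs[i] = (tmp[2 * i] + tmp[2 * i + 1]) / 2           # coarse averages
            coeffs[n // 2 + i] = (tmp[2 * i] - tmp[2 * i + 1]) / 2  # fine details
        n //= 2
    return coeffs

def ihaar(coeffs):
    """Inverse Haar transform."""
    out = list(coeffs)
    n = 1
    while n < len(out):
        tmp = out[:2 * n]
        for i in range(n):
            out[2 * i] = tmp[i] + tmp[n + i]
            out[2 * i + 1] = tmp[i] - tmp[n + i]
        n *= 2
    return out

signal = [4, 4, 4, 4, 8, 8, 2, 2]                 # made-up signal
c = haar(signal)
c_small = [v if abs(v) > 0.5 else 0 for v in c]   # drop small details ("compression")
print(ihaar(c))        # exact reconstruction of the signal
print(ihaar(c_small))  # close approximation from fewer nonzero coefficients
```

Piecewise-constant signals compress especially well under Haar, since most detail coefficients are exactly zero.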
Nonparametric Regression
Given noisy data observed at certain intervals, the aim is to approximate the data by a function without restricting ourselves to functions from a small family such as linear or polynomial models; smoothness or simplicity assumptions are made instead. Many methods have been suggested and studied; the most popular are kernel estimators, spline smoothing, local polynomial regression and wavelet thresholding. Local extreme values play an important role in many applications of nonparametric statistics because their positions often have meaningful interpretations. Recent methods based on minimising total variation, such as the taut string method, therefore try to fit the data with a function whose local extreme values occur only at positions indicated by the data. Practical applications of nonparametric regression include image decompression and signal cleaning, and general problems of dealing with missing data.
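As a concrete instance of a kernel estimator, one of the methods listed above, the following sketch fits noisy samples of a smooth curve by locally weighted averaging. The data, kernel and bandwidth are illustrative choices.

```python
import math, random

def nw_estimate(xs, ys, x, h):
    """Nadaraya-Watson kernel estimator: a locally weighted average of the
    responses, with Gaussian weights of bandwidth h (a tuning parameter)."""
    ws = [math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in xs]
    return sum(w * y for w, y in zip(ws, ys)) / sum(ws)

# Noisy samples of a smooth curve (synthetic data for the example).
random.seed(0)
xs = [i / 100 for i in range(101)]
ys = [math.sin(2 * math.pi * xi) + random.gauss(0, 0.1) for xi in xs]
print(round(nw_estimate(xs, ys, 0.25, h=0.05), 2))   # near sin(pi/2) = 1
```

No parametric form for the curve is assumed; only the bandwidth h encodes the smoothness assumption, and choosing it well is a central theme of the field.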
Number Theory
The number theory group at Bristol is one of the largest in the UK, and its interests span elliptic curves, computational number theory, quantitative arithmetic geometry, quadratic forms, L-functions and the Riemann zeta function, modular and automorphic forms, Diophantine approximation, applications of the Hardy-Littlewood (circle) method and arithmetic combinatorics. The investigation of these topics draws on tools from algebraic geometry, combinatorics, dynamical systems, harmonic analysis, mathematical physics, random matrix theory and representation theory.
Numerical Methods
The mathematical modelling of nonlinear phenomena sometimes leads to differential equations that are too difficult to solve by known analytical methods. In such cases, numerical methods can provide much insight into the properties of the solution set. Two threads of research are represented in the group: one where the use of numerical techniques is motivated by particular applications, and another where the focus is on the theoretical study of the effectiveness of the numerical methods themselves.
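A minimal example of the second thread, studying a method's effectiveness, is checking the first-order convergence of the explicit Euler scheme on a problem with a known solution. The test equation y' = y is chosen purely for illustration.

```python
import math

def euler(f, y0, t0, t1, n):
    """Explicit Euler method for y' = f(t, y): the simplest one-step scheme."""
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y += h * f(t, y)
        t += h
    return y

# y' = y, y(0) = 1 has exact solution e^t. Halving the step size roughly
# halves the global error at t = 1, exhibiting first-order convergence.
for n in (100, 200, 400):
    print(n, abs(euler(lambda t, y: y, 1.0, 0.0, 1.0, n) - math.e))
```

Observing how the error scales with the step size is the numerical analogue of the theoretical convergence proofs studied in the group.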
Staff in this area: Kerswell, Prof. Rich
Optimisation under Uncertainty
Optimisation under uncertainty covers a broad framework of problems at the interface of applied probability and optimisation. The main focus of work is on Markov decision processes, game theory, reinforcement learning and multi-agent systems. The underlying aim is to use a combination of models, techniques and theory from stochastic control, equilibrium selection and learning to determine behaviour that is optimal with regard to some given reward structure, for example in problems from behavioural biology. Markov decision processes describe a class of single-decision-maker optimisation problems that arise when applied probability models (e.g. Markov chains) are extended to allow for action-dependent transition distributions and associated rewards. Game theory problems are more complex in that they involve two or more decision makers (players), so the optimal action for each player will depend on the actions of other players. Here, interest focuses on Nash equilibria - strategies that are conditionally optimal in the sense that no player can do better by changing their strategy while other players stay with their current strategy. The problem is even more complicated when the transition probabilities or expected rewards are not fully known, or when the actions of the other players are not fully observable. Reinforcement learning algorithms use simulation-based techniques to “learn” the appropriate optimal or equilibrium behaviour. More generally, multi-agent systems address problems where, for example, decision makers are distributed in time or space, and each single agent has only partial information about the process. Here the objective is to find ways of collaborating that will enable the agents to reach optimal or near-optimal solutions.
Current research in the group includes: Decentralised control and optimisation, Nonstandard problems and algorithms for fluctuating environments, Convergence of reinforcement learning algorithms for game theory, Applications in behavioural biology and the optimal control of queues.
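The Markov decision process formulation above can be made concrete with a value-iteration sketch. The two-state model, its transition probabilities, rewards and discount factor are entirely invented for illustration; repeated Bellman updates contract to the optimal value function, from which an optimal policy is read off.

```python
# Tiny two-state Markov decision process solved by value iteration.
# P[s][a] = list of (probability, next_state); R[s][a] = expected immediate reward.
P = {0: {"stay": [(1.0, 0)], "go": [(0.7, 1), (0.3, 0)]},
     1: {"stay": [(1.0, 1)], "go": [(0.6, 0), (0.4, 1)]}}
R = {0: {"stay": 0.0, "go": 1.0},
     1: {"stay": 2.0, "go": 0.0}}
gamma = 0.9                                    # discount factor

V = {0: 0.0, 1: 0.0}
for _ in range(500):                           # Bellman updates contract to V*
    V = {s: max(R[s][a] + gamma * sum(p * V[t] for p, t in P[s][a]) for a in P[s])
         for s in V}

# Greedy policy with respect to the converged values.
policy = {s: max(P[s], key=lambda a: R[s][a] + gamma * sum(p * V[t] for p, t in P[s][a]))
          for s in V}
print({s: round(v, 2) for s, v in V.items()}, policy)
```

Reinforcement learning methods tackle the same problem when P and R are unknown and must be learned from simulated or observed transitions.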
Quantum Chaos
Quantum Mechanics, the theory of matter on small scales, plays a centrally important role in many of the most important areas of science and technology (e.g. lasers, mesoscopic and nanoscopic systems). However, few quantum systems can be solved analytically. For the rest, methods of approximation are required. Among these, asymptotic methods based on classical (Newtonian) mechanics are of increasing importance, especially in mesoscopic and nanoscopic systems, which lie at the boundary between the classical and quantum worlds. Within classical mechanics there is a broad spectrum of qualitatively different dynamics, ranging from integrable (completely regular) to strongly chaotic (highly irregular). Quantum chaos is the area of research concerned with how this fact manifests itself in quantum mechanics. It is an exciting and rapidly developing field, encompassing the mathematical analysis of new quantum phenomena and a wide variety of applications in many areas of science and technology (e.g. in nanoscale systems and microlasers). There are deep connections with Random Matrix Theory - the study of the statistical distribution of the eigenvalues of matrices picked at random from some suitably defined ensemble - Ergodic Theory, and several areas of Number Theory, such as the theory of the Riemann zeta function and other related objects. Many fundamental developments in the subject have followed from work carried out here in Bristol. There is a close relationship with the Dynamical Systems and Quantum Information areas, and with the group in Physics led by Prof Sir Michael Berry FRS.
Staff in this area: Dettmann, Prof. Carl; Keating, Prof. Jon; Marklof, Prof. Jens; Mezzadri, Prof. Francesco; Muller, Dr. Sebastian; Robbins, Prof. Jonathan; Schubert, Dr. Roman; Sieber, Dr. Martin; Snaith, Dr. Nina; Tourigny, Dr. Yves.
Quantum Computation and Quantum Information Theory
The new subjects of quantum computation and quantum information theory have recently emerged; they offer the potential for immense practical computing power and also suggest deep links between the well-established disciplines of quantum theory, information theory and computation. On the one hand, computer chips will soon be so small that we will have to grapple with the fact that electrons inside the processing elements become ‘smeared out’ - for example, they can tunnel out of the wires - and Heisenberg’s Uncertainty Principle seems to be at odds with the desire for reliable computation. On the other hand, it has been realised very recently that one might be able to take advantage of intrinsically quantum features to build quite new types of computers: ‘quantum computers’. We are only just beginning to understand what quantum information is and what quantum computers can do. We have close links with the physics and computer science departments, and our group is interested in all aspects of quantum information theory (foundations, non-locality, entanglement, quantum Shannon theory, quantum computational models, and applications of ideas from quantum information to other fields such as statistical mechanics), and in experiment (quantum key distribution, quantum photonics). The Quantum Computation and Quantum Information Theory group may advertise additional PhD and other positions on the web.
Random Matrix Theory
Random matrices are often used to study the statistical properties of systems whose detailed mathematical description is either not known or too complicated to allow any kind of successful approach. It is a remarkable fact that predictions made using random matrix theory have turned out to be accurate in a wide range of fields: statistical mechanics, quantum chaos, nuclear physics, number theory, combinatorics, wireless telecommunications, quantum field theory and structural dynamics, to name only a few examples. One of the main reasons for this fascinating modelling power is that, as the dimensions of the matrices tend to infinity, the local statistical properties of the eigenvalues become independent of the probability distribution on the given matrix space. This important and long-conjectured property of random matrices was proved only recently. This is a fast-developing field of research. Several applications of random matrices are studied in Bristol, including quantum transport, quantum chaos, quantum information and number theory, as well as the universal properties of random matrices.
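The local eigenvalue statistics mentioned above can be glimpsed in the smallest nontrivial case: 2x2 real symmetric matrices with Gaussian entries (the off-diagonal variance follows the usual GOE convention; the thresholds below are illustrative). The gap between the two eigenvalues exhibits level repulsion: very small gaps are much rarer than for independent levels.

```python
import math, random

random.seed(0)

def goe2_gap():
    """Eigenvalue gap of a random 2x2 real symmetric matrix [[a, b], [b, c]]
    with Gaussian entries (off-diagonal variance 1/2, the usual GOE convention)."""
    a, c = random.gauss(0, 1), random.gauss(0, 1)
    b = random.gauss(0, math.sqrt(0.5))
    return math.sqrt((a - c) ** 2 + 4 * b ** 2)   # difference of the two eigenvalues

gaps = [goe2_gap() for _ in range(200_000)]
mean = sum(gaps) / len(gaps)
# Level repulsion: normalised gaps below 0.1 occur with probability ~ s^2
# (here about 0.8%), whereas independent (Poisson) levels would give ~ s (about 10%).
small = sum(g < 0.1 * mean for g in gaps) / len(gaps)
print(round(small, 4))
```

This 2x2 computation reproduces the Wigner surmise, which closely approximates the universal spacing distribution of large random matrices.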
Scaling Limits
Random motion of a single particle or individual governed by simple stochastic rules is well understood by classical probability theory. The picture is very different if the rules of motion are more complicated, or if we study a population of interacting individuals. Examples include random processes with memory, where the behaviour of the particle is influenced by its own history; interacting particle systems, where many simple walkers interact with each other; and various models arising in population biology and ecology, where individuals interact with each other and possibly with the external environment. Many real-life systems can be studied using scaling limits, e.g. crystal growth, the spread of infectious diseases, traffic jams, forest fires and the spread of advantageous mutations. Mathematical theory often reveals unexpected behaviour. It is often difficult, if not impossible, to do calculations for these models, but scaling the parameters may help identify the limiting behaviour of these systems, about which we may be able to say something interesting. These limits can be in space or time, or both. Taking scaling limits often connects many fields of mathematics, such as analysis, combinatorics, complex function theory and partial differential equations, and is challenging and interesting in itself. Being able to quantify the phenomenon in some way, which may not be possible otherwise, is an added bonus.
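A minimal illustration of a diffusive scaling limit (walk length and sample size below are arbitrary): rescaling a simple random walk by the square root of its length produces, approximately, the Gaussian law of Brownian motion at time 1, an instance of Donsker's invariance principle.

```python
import random

# Diffusive scaling: a simple +/-1 random walk of n steps, divided by sqrt(n),
# is approximately N(0, 1) - the law of Brownian motion at time 1.
random.seed(0)
n, trials = 400, 5_000
ends = [sum(random.choice((-1, 1)) for _ in range(n)) / n ** 0.5
        for _ in range(trials)]
mean = sum(ends) / trials
var = sum(e * e for e in ends) / trials
print(round(mean, 2), round(var, 2))   # near 0 and 1
```

The limit object (Brownian motion) is far more tractable than the discrete walk, which is exactly why scaling limits are so useful for the complicated interacting systems described above.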
Statistical Bioinformatics
Statistical bioinformatics stands at the junction of biology and statistics, with input from mathematics and computer science, and is an interdisciplinary effort in which statisticians are responsible for modelling and data analysis, and biologists generate questions, and provide scientific knowledge and interpretation. Practical applications include protein sequencing, discovering gene regulatory networks, and causal inference in molecular epidemiology. Huge quantities of data are becoming available, potentially of great value in aiding scientific understanding and promoting prevention and cure of disease. However, data of these kinds show very complex patterns of variation, from a variety of sources, both biological and technical, and there are therefore fascinating challenges for statisticians wishing to contribute in this area. The main current interests in bioinformatics within the group are in gene expression data, protein matching and alignment, macromolecular structure modelling and some modelling issues in genetic epidemiology.
Statistical Physics
Interesting large-scale behaviour can arise from short-range local interactions between the constituent components of interacting particle systems. This can manifest as long-range correlations and dependencies, as well as more spectacular phenomena such as phase transitions, where internal symmetries break down. The study of probabilistic aspects of models of classical and quantum statistical physics involves many interesting topics, such as phase transitions, fluctuations in systems close to criticality or far from equilibrium, time-dependent behaviour, and diffusion in an environment that is spatially inhomogeneous but whose statistical properties are translation invariant. These can lead to extremely complex and difficult mathematics. To aid the study of these problems, we sometimes derive partial differential equations governing the macroscopic motion of the system from microscopic principles.
Statistical Signal Processing
Uncertainty is present in various forms in numerous information-engineering activities, for example telecommunications, target tracking, sensor data fusion, and signal and image processing. An interdisciplinary research programme at Bristol bridges the Statistics group and the Department of Electrical and Electronic Engineering by promoting the transfer of modern statistical methodology to signal processing and the use of these tools in concrete applications. Particular interests include approximate inference in large-scale statistical models, applications to communication and coding, machine learning, wavelet methods for data fusion, distributed computations, vesicle tracking in biological image processing and multiscale network visualisation.
Time Series
Time series are observations on a variable ordered in time. They arise in many fields, including biology, telecommunications, physics, finance and economics; one example from the world of finance is daily quotes of a share index such as the FTSE 100. Time series analysis is the branch of statistics whose main aims are: (a) to find a model which provides a good description of the main features of the data, and (b) given the model and the data, to forecast and/or control the future evolution of the process. These two stages of analysis often require the development of novel procedures and algorithms which depend on the particular problem at hand. Applications range from stock prices and environmental data to such problems as predicting volcano eruptions.
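Stages (a) and (b) above can be sketched for the simplest autoregressive model. The data below are synthetic, generated with a known coefficient so the least-squares fit can be checked; real series would of course require model selection and diagnostics first.

```python
import random

# Stage (a): fit an AR(1) model x_t = phi * x_{t-1} + noise by least squares.
random.seed(0)
phi_true, n = 0.7, 5_000          # known coefficient for the synthetic data
x = [0.0]
for _ in range(n):
    x.append(phi_true * x[-1] + random.gauss(0, 1))

phi_hat = (sum(x[t] * x[t - 1] for t in range(1, len(x)))
           / sum(x[t - 1] ** 2 for t in range(1, len(x))))
print(round(phi_hat, 2))          # close to phi_true = 0.7

# Stage (b): h-step forecasts from the last observation decay geometrically
# towards the series mean (0 here).
forecasts = [x[-1] * phi_hat ** h for h in (1, 2, 3)]
```

The geometric decay of the forecasts reflects the model's short memory; richer models (ARMA, state-space, nonstationary) follow the same fit-then-forecast pattern.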