Here we provide a brief overview of the main research areas in the School of Mathematics. Note that many members of staff have interests spanning several different areas, reflecting the many connections between the areas listed below.
Algebra | Analysis and Geometry | Applied Probability | Bayesian Modelling and Analysis | Combinatorial Algebraic Geometry | Combinatorics | Complexity Science | Ergodic Theory, Dynamical Systems and Statistical Mechanics | Fluid Dynamics | High-Dimensional and Highly-Structured Data | Logic and Set Theory | Material Science, PDEs, Variational Problems and Applications | Modern Regression Methods | Monte Carlo Computation | Multiscale Methods | Network Statistics | Nonparametric Regression | Number Theory | Optimisation under Uncertainty | Quantum Chaos | Quantum Computation and Quantum Information Theory | Random Matrix Theory | Scaling Limits | Soft and Biological Matter | Statistical Genetics | Statistical Physics | Statistical Signal Processing | Time Series Analysis
Algebra
The Algebra group focusses on group theory and representation theory. Groups are algebraic structures that arise naturally throughout mathematics. They encode the symmetries in a vast range of mathematical and physical systems, and group theory provides a powerful and unified language for studying these symmetries. Current areas of research in Bristol include finite and algebraic groups, simple groups and geometric group theory. Representation theory, in its broadest sense, is the art of relating the symmetries of different objects. It is a vast subject enjoying a close relationship with topology, geometry, number theory, combinatorics and mathematical physics.
Analysis and Geometry
The main research themes in the Analysis and Geometry group are (i) Spectral Geometry, (ii) Geometric Function Theory, (iii) Hyperbolic Geometry and Low Dimensional Topology. Spectral Geometry studies the interplay between the spectrum of elliptic operators, such as the Laplace operator, and the geometry and topology of the manifold or domain in Euclidean space. Various tools such as functional analysis, calculus of variations, harmonic analysis, comparison geometry and shape optimization techniques play a key role. Geometric function theory is concerned with how infinitesimal properties of functions, such as conformality, have large-scale geometric consequences. Hyperbolic Geometry and Low Dimensional Topology is the branch of topology and geometry that studies surfaces, the mapping class group, and Teichmüller theory, i.e. the space of marked hyperbolic surfaces. All these topics have links to each other and also to many other research areas at Bristol, such as Ergodic Theory and Dynamical Systems, Geometric Group Theory, Quantum Chaos, Probability and Number Theory.
Applied Probability
Many real-world systems, such as queueing systems, communication networks and financial markets, evolve in a random fashion over time. Research in applied probability uses results from the theory of probability to model these processes mathematically, so that they can be better understood. Advances in stochastic approximation, non-Markovian random walks, stochastic control and related techniques have led to developments in other topics, including Monte Carlo Computation and Optimisation under Uncertainty.
Bayesian Modelling and Analysis
In the Bayesian approach to statistics, observables, predictands, and model parameters are all treated as random variables, which allows the observations to be incorporated by conditioning, and probabilistic predictions to be made by integrating out the model parameters. This powerful unifying framework has become far more accessible in the last twenty years, owing to improvements in computer power, and in stochastic algorithms for conditioning and integrating, particularly Markov Chain Monte Carlo (MCMC) algorithms. This framework also allows us to develop more complex statistical models, suitable for modern high-dimensional and highly-structured data. It is not uncommon for a Bayesian model to have hundreds or even thousands of parameters, but the effective number of parameters, which is data-determined, can be far fewer.
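As a minimal illustration of conditioning and integrating out (a textbook sketch, not tied to any particular project in the School), consider a Beta-Binomial model, where both steps can be done in closed form:

```python
# Beta-Binomial model: the success probability theta is itself a random
# variable with a Beta(a, b) prior. Observing k successes in n trials
# incorporates the data by conditioning (conjugate update), and
# predictions integrate theta out analytically.
a, b = 1.0, 1.0                 # uniform prior on theta
k, n = 7, 10                    # observed data: 7 successes in 10 trials

a_post, b_post = a + k, b + (n - k)          # Beta posterior by conditioning
posterior_mean = a_post / (a_post + b_post)

# posterior predictive probability of success on the next trial:
# theta has been integrated out, and the answer equals the posterior mean
p_next = a_post / (a_post + b_post)
```

In models without conjugate structure the same two steps are carried out numerically, which is where MCMC algorithms enter.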
Bayesian methods are now mainstream in Machine Learning, and they are also widely used in more complex applications such as signal processing, target-tracking, protein folding, and genetic epidemiology, and in many applications involving latent processes, including spatial statistics. Bayesian decision theory is crucial in the development of transparent early warning systems, for example for extreme weather, or for volcanic eruptions.
Current Bayesian applications in the School include global-scale spatial statistics for measuring and predicting sea-level rise; spatio-temporal modelling of long-term daily air-pollution data; and analysis of ancestry in population genetics. Theoretical work on Bayesian analysis includes high-dimensional sparse computation, visualisation for model-checking, Bayesian asymptotics, and asymptotic approximation in inverse problems.
Combinatorial Algebraic Geometry
Combinatorial algebraic geometry is the study of varieties with combinatorial structures, which may be given from the outset or obtained as degenerations of algebraic varieties. The field therefore encompasses toric and tropical geometry, which arise throughout algebraic geometry, commutative algebra, representation theory, mathematical physics, and many other fields. The research interests of our group in Bristol include tropical and toric geometry as well as matroid theory, combinatorial Hodge theory, and degenerations of classical algebraic varieties to toric or non-Archimedean varieties, or even to analytic objects.
Combinatorics
While combinatorial structures are often investigated for their intrinsic beauty, problems of a combinatorial nature arise in many other areas of pure mathematics, notably in group theory, probability, topology, algebraic geometry, geometry and number theory. Bristol has particular strengths in advancing interactions of combinatorics with the latter three areas, specifically in the study of algebraic varieties through their combinatorial counterparts like matroids, hyperplane arrangements, polytopes and lattices; incidence geometry and geometric combinatorics; and Szemeredi-type problems concerning arithmetic structures in subsets of the integers and the primes.
Complexity Science
Complexity is a multidisciplinary subject linking mathematics, statistics and computer science with application areas in engineering, life and molecular sciences, and social sciences. Some of the challenges are in the development and application of mathematical tools, including in complex network theory, information theory, statistical mechanics, and others. The School of Mathematics has ongoing collaborations with staff in the School of Biological Sciences, Engineering Mathematics, Chemistry, Computer Science and Economics, as well as Philosophy.
Ergodic Theory, Dynamical Systems and Statistical Mechanics
Ergodic theory is a branch of pure mathematics that investigates the statistical properties of dynamical systems. The time evolution of even very simple systems can be completely unpredictable, and one of the key objectives of ergodic theory is to identify and classify measures that are invariant under the time evolution, thus allowing deep insights into the structure of the dynamics. Simple examples of chaotic dynamical systems include geodesic flows on negatively curved surfaces, and billiard tables with convex scatterers (Sinai billiards). Ergodic theory has provided powerful tools to solve outstanding problems in other research fields, e.g. in number theory, combinatorics, quantum chaos and statistical physics. Indeed, most physical problems can be viewed as dynamical systems. Typically this involves studying the solution structure of nonlinear equations, understanding how these solutions may vary as the dynamical system changes, and discerning generic properties of the solutions, for example, whether they exhibit chaotic behaviour. There are connections to the Quantum Chaos research area.
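A toy numerical illustration of these ideas (an illustrative sketch, not drawn from the group's research): for the chaotic logistic map x → 4x(1−x), the Birkhoff time average of an observable along a typical orbit converges to its average against the invariant measure; for the observable f(x) = x that spatial average is 1/2.

```python
# Birkhoff time average of f(x) = x along an orbit of the logistic map
# x -> 4x(1-x), which is ergodic with respect to an explicit invariant
# (arcsine) measure; the time average converges to the space average 1/2.
x = 0.2            # a typical (non-exceptional) starting point
total = 0.0
n = 100_000
for _ in range(n):
    x = 4.0 * x * (1.0 - x)
    total += x
time_average = total / n   # close to 1/2 for a typical orbit
```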
Fluid Dynamics
Fluid Dynamics describes a tremendous variety of phenomena from the large scale (e.g. weather and ocean systems on Earth and other planets or stars) through medium scales (e.g. the flow around Grand Prix racing cars) to very small scales (e.g. microdroplets for drug delivery). Our main aim is to understand how the nonlinear character of the hydrodynamic equations leads to the wealth of flow properties observed. These can range from the formation of novel flow structures, such as the pinch-off of a fluid drop, to large-scale, turbulent flows. We are also pushing beyond the boundaries of classical hydrodynamics by studying polymeric fluids, granular media and flows on the nanoscale. Our work on active fluids reveals how motion is created in biological systems. These activities place our group at a junction between mathematics, physics, chemistry, biology, engineering and geophysics.
High-Dimensional and Highly-Structured Data
High-dimensional statistics studies data whose dimension is larger than that treated in classical statistical theory. There has been a dramatic surge of interest and activity in high-dimensional statistics over the past two decades, due to new applications and datasets, and theoretical advances. With high-dimensional data, statistical activities such as variable selection, estimation, and hypothesis testing must scale conservatively with the number of cases, and this rules out many traditional statistical approaches. Theoretical work in high-dimensional statistics studies sequential, iterative, and approximation approaches, to establish whether they scale conservatively, and what their statistical properties are.
Many high-dimensional datasets are also highly-structured: relational data, for example, which are ubiquitous across a wide range of application areas including public health, life science, social science, and finance, and of course social media. The mathematical representation of a relational network is a graph, comprising vertices (sometimes called ‘nodes’) and edges between vertices. For example, representing brain voxels and their connectivity, to understand brain structure. Or, in epidemiology, representing individuals and their contacts, to help policy-makers to mitigate harm and also to manage outbreaks. Or, in cyber-security, representing a computer network in order to monitor it for suspicious changes in behaviour. In some applications, such as cyber-security, the graph is pre-specified and interest lies in how graph-structured data evolves in time. In other applications, such as neuroscience, the graph itself must be inferred from data such as brain scans. Inferring and exploring graphs involves a ‘combinatorial explosion’ of edges and paths, and many key operations on graphs are known to be NP-hard (computationally intractable), and must be approximated.
Current research in the School focuses on developing methods for uncovering sparse/low-rank structure in high-dimensional data, such as factor analysis of high-dimensional panels of financial indicators and price processes, and Bayesian methods for structural learning and inference, including inference about relationships from DNA mixtures in forensic statistics.
Logic and Set Theory
Set Theory can be regarded as a foundation for all mathematics. Nevertheless, many mathematical problems of foundational interest remain unresolved: the Continuum Hypothesis is only one example. Current set theory provides direct applications in many areas of mathematics, particularly of an analytical kind: Banach space theory, C*-algebras, as well as Lebesgue measure theory (which sets are Lebesgue measurable?). Current interests of the group in Bristol include: (i) the interaction between models of set theory, determinacy of perfect information games and analysis; (ii) the interaction between set theory and the category theory of algebraic topology; (iii) on the mathematical-logical side, the proof theory of set theory's axioms when augmented by truth predicates.
Material Science, PDEs, Variational Problems and Applications
Recent years have seen intense interaction between mathematics and materials science, including solid mechanics and liquid crystals. This has borne much fruit: in solid mechanics, the explanation of intriguing material behaviour (e.g., the shape-memory effect) by mathematical models that relate behaviour to microstructure; in liquid crystals, models relating static and dynamic properties to the existence and regularity of harmonic maps, possibly with defects, between topologically nontrivial spaces. This successful interaction has in turn raised a number of questions, many of which are of interest simultaneously in mathematics, in the physical sciences and in engineering. The relevant mathematical areas are primarily, but not exclusively, the calculus of variations, partial differential equations, functional and real analysis, and topology. Specific research problems include the following. Solid mechanics: microstructure evolution in solids; homogenization of (polycrystalline) materials with degenerate energy; computation of quasiconvex hulls; and morphology formation in biological tissues as a result of stresses induced by growth. Liquid crystals: topological classification and energy bounds for nematics in polyhedral geometries with natural (e.g. tangent) boundary conditions; the number of smooth solutions and regularity of weak solutions; and switching mechanisms, with applications to bistable display technology.
Modern Regression Methods
The aim of regression modelling is to determine how the distribution of a noisy response variable depends on one or more independent variables, or ‘covariates’. Despite having been around for more than two hundred years, regression modelling is a very active and fast-paced research area. Modern regression methods are not limited to a continuous response, or to modelling the conditional mean of the response (Generalized Linear Models, quantile regression). They can also include a wide variety of non-linear covariate effects, which are constructed using basis expansions or stochastic processes (Generalized Additive Models, GAMs). This makes these regression models much more adaptable to the empirical relationship between the covariates and the response, although it also introduces the danger of over-fitting, which tends to undermine predictive performance: controlling for overfitting is a major topic in regression modelling.
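The basis-expansion idea can be sketched as follows; the truncated-power basis, the simulated data and the value of the smoothing parameter below are illustrative choices, not the School's methodology:

```python
import numpy as np

# A minimal penalised basis-expansion fit (the idea behind a GAM smooth):
# represent a non-linear covariate effect in a basis and shrink the
# coefficients with a ridge penalty to guard against over-fitting.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x) + 0.2 * rng.normal(size=200)   # noisy response

# truncated power basis: intercept, linear term, and hinge functions
# max(x - t, 0) at interior knots t
knots = np.linspace(0.1, 0.9, 9)
B = np.column_stack([np.ones_like(x), x] +
                    [np.maximum(x - t, 0.0) for t in knots])

lam = 1e-2                      # smoothing parameter (illustrative value)
beta = np.linalg.solve(B.T @ B + lam * np.eye(B.shape[1]), B.T @ y)
fit = B @ beta                  # fitted smooth curve
```

In practice the smoothing parameter is chosen from the data, e.g. by cross-validation or marginal likelihood, which is one of the computational challenges mentioned below.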
From a practical point of view, the most pressing challenge in regression modelling is developing estimation methods that can handle large datasets, including techniques such as sequential learning and parallel computing. Machine Learning (ML) is a major user of modern regression methods, and ML research is a productive source of new algorithms for modern regression: modern regression is in the intersection of ML and computational statistics. Current research in the School includes developing theory and more efficient computational methods for GAMs; well founded methods for smooth additive quantile regression; scalable computation for smooth regression models in general; efficient INLA methods for non-sparse models; big model/data visualization; controlling spatial confounding in complex regression models.
Monte Carlo Computation
Monte Carlo methods are simulation algorithms designed to compute answers to deterministic questions using random numbers. They are used in statistics principally to compute probabilities and expectations in complex stochastic models, and are at the origin of the ever-increasing popularity of Bayesian methods. Monte Carlo methods were first imported into Statistics from Physics: they originated in Los Alamos and the atomic bomb project, although there is reference to them as far back as the ancient Babylonians of Biblical times. Now they are applied across many scientific fields, including engineering, aerospace, image and speech recognition and robot navigation.
Modern data structures, like streaming data and high-dimensional datasets, are challenging for traditional Monte Carlo methods such as the well-known Metropolis-Hastings algorithm and the Gibbs sampler (both based on reversibility), because they cannot efficiently take advantage of the growing parallel computing power of modern computers. Current research in the School includes non-reversible and continuous-time Markov chain Monte Carlo methods, distributed particle filters and stochastic optimisation algorithms.
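For concreteness, a minimal random-walk Metropolis-Hastings sampler (one of the reversible methods mentioned above), here targeting a standard normal density purely for illustration:

```python
import numpy as np

def metropolis_hastings(log_target, n_samples, step=1.0, seed=0):
    # random-walk Metropolis-Hastings: propose a Gaussian perturbation
    # and accept with probability min(1, pi(proposal) / pi(current))
    rng = np.random.default_rng(seed)
    x = 0.0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        proposal = x + step * rng.normal()
        if np.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal                    # accept
        samples[i] = x                      # otherwise keep current state
    return samples

# target: standard normal, log-density -z^2/2 up to a constant
draws = metropolis_hastings(lambda z: -0.5 * z * z, 50_000)
```

The draws form a Markov chain whose long-run averages approximate expectations under the target, which is exactly how such samplers are used to compute Bayesian posterior quantities.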
Multiscale Methods
In recent years multiscale methods have revolutionised the modelling and analysis of phenomena in a number of different disciplines. The 'multiscale' paradigm typically involves a multiscale representation, and then manipulation of that representation to achieve a desired goal. Practical applications include modelling communications network traffic, such as queues on routers, and image compression: the JPEG 2000 image standard is based on wavelet compression, as is the FBI fingerprint database. In addition, wavelets can sparsify systems, transforming previously challenging problems into ones which admit elegant solutions, using methods of high-dimensional mathematics and statistics.
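The sparsifying effect of a wavelet representation can be seen in one level of the Haar transform, the simplest wavelet (an illustrative sketch, unrelated to any specific project above):

```python
import numpy as np

def haar_step(x):
    # one level of the orthonormal Haar wavelet transform:
    # pairwise averages (coarse scale) and pairwise differences (detail)
    s = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return s, d

# a signal that is locally constant has a sparse detail vector:
# all the information sits in a few coarse coefficients
x = np.array([4.0, 4.0, 4.0, 4.0, 8.0, 8.0, 8.0, 8.0])
s, d = haar_step(x)
```

Because the transform is orthonormal it preserves energy, and thresholding the (mostly zero) detail coefficients is the basis of wavelet compression and denoising.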
Multiscale methods are often required in large-scale spatial statistics, where it is necessary to merge observational datasets having very different spatial footprints: for example, GPS measurements made at a single point, LIDAR measurements made along a transect, and GRACE satellite measurements which average over an area of hundreds of square kilometres. These observational datasets are crucial in determining current and future sea levels, and in understanding the impact of climate change.
Current research in the School includes ‘lifting’: using second-generation wavelets to tackle more realistic problems, where data are not uniformly spaced or arise on some complex manifold. Lifting methods provide computationally efficient methods of producing approximate wavelet coefficients, which share most of the attractive properties of first-generation wavelets, including sparsity, efficiency and the ability to manipulate objects and systems at multiple scales. Also hierarchical multi-resolution methods (e.g. in the context of electricity demand forecasting using smart meter data).
Network Statistics
Networks are fundamental tools for representing relational data, which are ubiquitous across a wide range of application areas including public health, life science, social science and finance, to name but a few. For instance, in neuroscience, representing brain voxels as nodes and their functional connectivity as edges facilitates the study of brain structure; in epidemiology, representing individuals as nodes and disease transmission events as edges enables decision-makers to adopt effective policies. This abstraction calls for powerful mathematical tools, among which statistics stands out when datasets are of massive scale. Research in this area includes developing theoretically sound and computationally efficient methods for network estimation and inference.
Nonparametric Regression
Given noisy data observed at certain intervals, the aim is to approximate the data by a function without restricting ourselves to functions from a small family like linear or polynomial models; smoothness or simplicity assumptions are made instead. Many methods have been suggested and studied; the most popular are kernel estimators, spline smoothing, local polynomial regression and wavelet thresholding. Local extreme values play an important role in many applications of nonparametric statistics because their positions often have meaningful interpretations. Recent methods based on minimising total variation, like the taut string method, therefore try to fit the data with a function that contains local extreme values only at positions indicated by the data. Practical applications of nonparametric regression include image decompression and signal cleaning, and general problems of dealing with missing data.
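A minimal sketch of one of the methods named above, the Nadaraya-Watson kernel estimator, with an illustrative Gaussian kernel and a hand-picked bandwidth:

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, bandwidth):
    # Gaussian-kernel weights between each evaluation point and each
    # data point; the estimate is the locally weighted average of y
    w = np.exp(-0.5 * ((x_eval[:, None] - x_train[None, :]) / bandwidth) ** 2)
    return (w @ y_train) / w.sum(axis=1)

# simulated example: noisy observations of a smooth sine curve
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 200))
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=200)

grid = np.linspace(0.1, 0.9, 50)
fit = nadaraya_watson(x, y, grid, bandwidth=0.05)
```

The bandwidth plays the same role as the smoothing parameter in spline methods: too small and the fit chases the noise, too large and genuine local extreme values are smoothed away.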
Number Theory
The number theory group at Bristol is one of the largest in the UK, and its interests span elliptic curves, computational number theory, quantitative arithmetic geometry, quadratic forms, L-functions and the Riemann zeta function, modular and automorphic forms, Diophantine approximation, applications of the Hardy-Littlewood (circle) method and arithmetic combinatorics. The investigation of these topics draws on tools from algebraic geometry, combinatorics, dynamical systems, harmonic analysis, mathematical physics, random matrix theory and representation theory.
Staff in this area: Bober, Dr. Jonathan; Booker, Dr. Andrew; Browning, Prof. Tim; Conrey, Prof. Brian; Dokchitser, Prof. Tim; Gorodnik, Prof. Alexander; Keating, Prof. Jon; Lee, Dr. Min; Najnudel, Dr. Joseph; Snaith, Dr. Nina; Walling, Dr. Lynne; Wooley, Prof. Trevor
Optimisation under Uncertainty
Classical optimisation deals with problems in which the objective function is precisely known, and the challenge is to develop efficient algorithms for problems with a large number of variables and constraints. But in many practical applications there may be uncertainty about the parameters of the objective function, or the function itself may be unknown. Such problems require a combination of inferring the objective function by choosing actions and observing rewards (or costs), and optimising over the imperfectly inferred objective function.
A common approach to such problems is to use probabilistic models to describe the objective function, providing a framework for dealing jointly with inference and optimisation. However, many practical problems of this type are too large and complex to admit exact solutions. A common strategy is therefore to derive bounds on the achievable performance of any algorithm, and to develop heuristics that can be shown to achieve performance close to those bounds.
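One standard heuristic of this kind is the UCB1 index policy for multi-armed bandits; the sketch below, with made-up Bernoulli reward probabilities, illustrates the interplay between inferring rewards and optimising over them:

```python
import numpy as np

def ucb1(means, horizon, rng):
    # UCB1: play each arm once, then always play the arm with the
    # highest empirical mean plus an optimism bonus that shrinks as
    # the arm is sampled more often
    k = len(means)
    counts = np.zeros(k)
    totals = np.zeros(k)
    for t in range(horizon):
        if t < k:
            arm = t                                   # initial exploration
        else:
            bonus = np.sqrt(2.0 * np.log(t + 1) / counts)
            arm = int(np.argmax(totals / counts + bonus))
        reward = rng.random() < means[arm]            # Bernoulli reward
        counts[arm] += 1
        totals[arm] += reward
    return counts

rng = np.random.default_rng(1)
# three arms with (unknown to the algorithm) success probabilities
pulls = ucb1(np.array([0.2, 0.5, 0.8]), horizon=5000, rng=rng)
```

Over a long horizon almost all pulls concentrate on the best arm, while the optimism bonus guarantees that no arm is abandoned prematurely.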
Current research in the School touches upon multi-armed bandits, Markov decision processes and reinforcement learning, optimisation on random graphs, and applications to communications and computer science.
Quantum Chaos
Quantum Mechanics, the theory of matter on small scales, plays a central role in many of the most important areas of science and technology (e.g. lasers, mesoscopic and nanoscopic systems). However, few quantum systems can be solved analytically. For the rest, methods of approximation are required. Among these, asymptotic methods based on classical (Newtonian) mechanics are of increasing importance, especially in mesoscopic and nanoscopic systems, which lie at the boundary between the classical and quantum worlds. Within classical mechanics there is a broad spectrum of qualitatively different dynamics, ranging from integrable (completely regular) to strongly chaotic (highly irregular). Quantum chaos is the area of research concerned with how this fact manifests itself in quantum mechanics. It is an exciting and rapidly developing field, encompassing the mathematical analysis of new quantum phenomena and a wide variety of applications in many areas of science and technology (e.g. in nanoscale systems and microlasers). There are deep connections with Random Matrix Theory (the study of the statistical distribution of the eigenvalues of matrices picked at random from some suitably defined ensemble), with Ergodic Theory, and with several areas of Number Theory, such as the theory of the Riemann zeta function and other related objects. Many fundamental developments in the subject have followed from work carried out here in Bristol. There is a close relationship with the Dynamical Systems and Quantum Information areas, and with the group in Physics led by Prof Sir Michael Berry FRS.
Staff in this area: Dettmann, Prof. Carl; Keating, Prof. Jon; Marklof, Prof. Jens; Mezzadri, Prof. Francesco; Muller, Dr. Sebastian; Robbins, Prof. Jonathan; Schubert, Dr. Roman; Sieber, Dr. Martin; Snaith, Dr. Nina; Tourigny, Dr. Yves
Quantum Computation and Quantum Information Theory
The relatively new subjects of quantum computation and quantum information theory offer the potential for immense practical computing power, and also suggest deep links between the well-established disciplines of quantum theory, information theory and computation. On the one hand, computer chips will soon be so small that we will have to grapple with the fact that electrons inside the processing elements become 'smeared out', for example they can tunnel out of the wires; Heisenberg's Uncertainty Principle seems to be at odds with the desire for reliable computation. On the other hand, it has been realised that one might be able to take advantage of intrinsically quantum features to build quite new types of computer: 'quantum computers'. We are only just beginning to understand what quantum information is and what quantum computers can do. We have close links with the Physics and Computer Science departments, and our group is interested in all aspects of quantum information theory (foundations, non-locality, entanglement, quantum Shannon theory, quantum computational models, and applications of ideas from quantum information to other fields such as statistical mechanics) and experiment (quantum key distribution, quantum photonics).
Random Matrix Theory
Random matrices are often used to study the statistical properties of systems whose detailed mathematical description is either not known or too complicated to allow any kind of successful approach. It is a remarkable fact that predictions made using random matrix theory have turned out to be accurate in a wide range of fields: statistical mechanics, quantum chaos, nuclear physics, number theory, combinatorics, wireless telecommunications, quantum field theory and structural dynamics, to name only a few examples. One of the main reasons for this fascinating modelling power is that, as the dimensions of the matrices tend to infinity, the local statistical properties of the eigenvalues become independent of the probability distribution on the given matrix space. This important and long-conjectured property of random matrices was proved only recently. This is a fast-developing field of research. Several applications of random matrices are studied in Bristol, including quantum transport, quantum chaos, quantum information and number theory, as well as the universal properties of random matrices.
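The global eigenvalue behaviour can be glimpsed numerically: the sketch below (illustrative matrix size and seed only) samples a GOE-like real symmetric matrix and checks that its rescaled eigenvalues fill out the support of Wigner's semicircle law, [-2, 2]:

```python
import numpy as np

# sample a real symmetric random matrix with independent Gaussian
# entries (a GOE-like ensemble) and rescale its spectrum by sqrt(n)
rng = np.random.default_rng(0)
n = 500
a = rng.normal(size=(n, n))
h = (a + a.T) / np.sqrt(2)          # symmetric; off-diagonal variance 1

# eigvalsh exploits symmetry and returns real eigenvalues in order;
# after rescaling, they lie (asymptotically) on the interval [-2, 2]
eigs = np.linalg.eigvalsh(h) / np.sqrt(n)
```

Repeating the experiment with non-Gaussian entries of the same variance yields the same limiting spectrum, a simple instance of the universality phenomenon described above.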
Scaling Limits
Random motion of a single particle or individual governed by simple stochastic rules is well understood by classical probability theory. The picture is very different if the rules of motion are more complicated, or if we study a population of individuals that interact. Examples include random processes with memory, where the random behaviour of the particle is influenced by the particle's own history; interacting particle systems, where many simple walkers interact with each other; and various models arising in population biology and ecology, where individuals interact with each other as well as possibly with the external environment. Many real-life systems can be studied using scaling limits, e.g. crystal growth, the spread of infectious diseases, traffic jams, forest fires, and the spread of advantageous mutations. Mathematical theory often reveals unexpected behaviour. It is often difficult if not impossible to do calculations for these models, but scaling the parameters may help to identify the limiting behaviour of these systems, about which we may be able to say something interesting. These limits can be in space or time, or both. Taking scaling limits often connects many fields in mathematics, such as analysis, combinatorics, complex function theory, and partial differential equations, and is challenging and interesting in itself. Being able to quantify a phenomenon that may not be tractable otherwise is an added bonus.
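The simplest example of a scaling limit is Donsker's invariance principle: a simple random walk, rescaled, converges to Brownian motion. The sketch below (with arbitrary simulation sizes and seed) checks the limiting endpoint distribution, which is standard normal:

```python
import numpy as np

# Donsker's invariance principle in miniature: the endpoint of an
# n-step simple +/-1 random walk, scaled by 1/sqrt(n), is
# approximately N(0, 1) for large n
rng = np.random.default_rng(0)
n_steps, n_walks = 1000, 2000
steps = rng.choice([-1, 1], size=(n_walks, n_steps))
endpoints = steps.sum(axis=1) / np.sqrt(n_steps)
# the empirical mean and variance of the scaled endpoints
# should be close to 0 and 1 respectively
```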
Soft and Biological Matter
Soft Matter is the study of materials that are able to deform and flow at room temperature. They include colloids, granular materials and polymers, but also everyday materials such as jam and shaving foam, and 'active materials' such as groups of cells or even swarms of fish. This highly interdisciplinary field combines analytical tools from statistical mechanics and fluid dynamics with direct numerical computation, using a variety of methods to understand how complex flow and complex mechanical properties arise from the simple properties of the individual building blocks. The group has a number of active collaborations with experimentalists in the departments of Physics, Chemistry, Biochemistry, Physiology, Neuroscience, and Biology.
Statistical Genetics
Our evolutionary history lies embedded within our genome. Statistical genetics addresses questions such as: Have there been population bottlenecks or population explosions in the past? Has another group contributed to the genome of the current population? Is a certain genetic mutation under natural selection? The recent study showing that roughly 2% of human DNA has Neanderthal origin, and the strong genetic evidence supporting the out-of-Africa origin of modern humans, are both outstanding examples of how we can infer historical events using only a sample of DNA extracted at present.
Statistical genetics is highly interdisciplinary and draws on many fields, including genetics, molecular biology, computer science, statistics and probability theory. Huge datasets such as the UK Biobank and the 100,000 Genomes Project are becoming available as a result of low-cost genotyping and next-generation sequencing, and so this is yet another high-dimensional and highly-structured data challenge, where the structure comes from the complexity of how genetic patterns are shared within and across generations.
Current research in the School focuses on constructing computationally efficient models that extract information from sequence data or whole-genome datasets to infer population parameters. Examples include estimating the selection coefficients of mutations, fine-scale population structure, and historical migration rates of one subgroup into another, and looking for evidence of evolution and response to environmental change in the past, using whole-genome sequence data sampled from various sites and at various historical times. There are strong links with the Integrative Epidemiology Unit at Bristol, allowing methodology to be translated into application.
Statistical Physics
Interesting large-scale behaviour can arise from short-range local interactions between the constituent components of interacting particle systems. This can manifest as long-range correlations and dependencies, as well as more spectacular phenomena such as phase transitions, where internal symmetries break down. The study of probabilistic aspects of models of classical and quantum statistical physics involves many interesting topics, such as phase transitions, fluctuations in systems close to criticality or far from equilibrium, time-dependent behaviour, and diffusion in an environment that is spatially inhomogeneous but whose statistical properties are translation invariant. These problems can lead to extremely complex and difficult mathematics. To aid their study, we sometimes derive partial differential equations governing the macroscopic motion of the system from microscopic principles.
Statistical Signal Processing
Uncertainty is present in various forms in numerous information engineering activities, for example telecommunications, target tracking, sensor data fusion, and signal and image processing. This interdisciplinary research programme at Bristol bridges the Statistics group and the Department of Electrical and Electronic Engineering by promoting the transfer of modern statistical methodology to signal processing, and by applying these tools in concrete applications. Particular interests include approximate inference in large-scale statistical models, applications to communication and coding, machine learning, wavelet methods for data fusion, distributed computations, vesicle tracking in biological image processing and multiscale network visualisation.
Time Series Analysis
Time series are observations on variables indexed by time, or some other meaningful ordering. They are frequently collected in many areas such as finance, medicine, engineering, natural and social sciences. One example from the world of finance is daily quotes of share indices, such as FTSE 100. Time series analysis aims at: (i) finding a model that provides a good description of the main features of the data, and (ii) given the model and the data, forecasting and/or controlling the future evolution of the process. These two stages of analysis often require the development of novel procedures and algorithms which depend on the particular problem at hand. One important and wide-ranging example is statistical signal processing, where modern statistical methods are applied across a variety of information engineering activities,
such as telecommunications, target tracking, sensor data fusion, and signal and image processing.
Among the many branches of time series analysis, change-point analysis allows some stochastic properties of the data to be time-varying. Change-point detection problems have a relatively long history, dating back at least to World War II. The area is now going through a renaissance, with research into computationally fast and statistically efficient methods driven by the emergence of complex data types, for instance high-dimensional vectors, high-dimensional matrices and networks, and of large datasets observed in highly non-stationary environments. Current change-point research in the School covers theoretical properties of algorithms for a variety of models, and a range of applications including: the detection of DNA copy number aberrations in cancer research, structural break analysis in large financial datasets, and anomaly detection in computer networks which may indicate a cyber-attack.
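A classical building block for detecting a single change in mean is the CUSUM statistic; the sketch below (simulated data, illustrative only) locates the change by maximising the scaled difference between left- and right-segment means:

```python
import numpy as np

def cusum_changepoint(x):
    # CUSUM statistic for a single change in mean: for each split point
    # k, compare the means of x[:k] and x[k:], scaled so the statistic
    # has comparable variance at every k
    n = len(x)
    total = x.sum()
    left = np.cumsum(x)[:-1]              # sums over x[:k], k = 1..n-1
    k = np.arange(1, n)
    stat = np.sqrt(k * (n - k) / n) * np.abs(left / k - (total - left) / (n - k))
    return int(k[np.argmax(stat)])        # estimated change location

# simulated series: mean 0 for 100 points, then mean 2 for 100 points
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 100), rng.normal(2.0, 1.0, 100)])
cp = cusum_changepoint(x)
```

Multiple change-points are typically found by applying such a statistic recursively (binary segmentation) or via penalised global optimisation, which is where the computational questions mentioned above arise.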
Time series research in the School also includes: (i) work on non-stationary time series in terms of local autocovariance, partial autocovariance and spectral estimation, with applications and improvements to forecasting in these situations; and (ii) questions related to the time series sample rate: for example, given a series at a particular rate, is it possible to tell whether it should be sampled at a faster rate, and whether this would be cost-effective?