Data-Intensive PhD positions
The School of Physics at the University of Bristol expects to offer several Ph.D. studentships associated with a STFC-funded Centre for Doctoral Training in Data-Intensive Science. This CDT has been established by a consortium of Bristol, Cardiff and Swansea Universities in order to deliver specialist training in data intensive techniques addressing science questions across particle physics, astrophysics and gravitational physics.
The studentships are for 4 years and include six months of industrial placement with partner organisations in the commercial, industrial or private sectors. These placements are likely to occur during the third year and will entail applying the data science techniques acquired during the PhD training to areas that do not necessarily relate to the thesis topic. Applications are welcome from UK, EU and other international applicants.
Bristol is offering research projects in both Astrophysics and Particle Physics to start in October 2017. Potential projects (with supervisors) include:
X-ray emission from black holes (Young)
The highly variable X-ray emission from around rotating black holes is studied by comparing X-ray spectroscopy with detailed models of the radiative transfer process in a highly structured space-time and gas environment. Models are created using GPU-based codes and are controlled by a large number of parameters: model fitting is complicated and requires searching a high-dimensional parameter space.
Polarized radio maps (Birkinshaw)
At the faintest surface-brightness levels, radio interferometer images are severely distorted by imperfections in the large volumes of data coming in from telescopes such as the JVLA or LOFAR. Our investigations into the magnetization of the intergalactic medium require further development of techniques for making polarized radio maps at such faint levels - to track the filamentary outer structures now appearing - and then require simulations of the amplification of injected magnetic fields in the turbulent intergalactic medium.
Simultaneous self-consistent modelling of galaxy properties (Bremer)
The various properties of galaxies are usually measured independently of each other, taking little account of the fact that each influences the others. Also usually ignored is that these properties vary across an individual galaxy when it is separated into its various components (e.g. bulge and disk). To accurately model individual galaxies and their evolution requires a simultaneous and self-consistent modelling of the multiple properties and components for each system. The combination of giant data sets from a new generation of astronomical facilities (e.g. LSST, Euclid, JWST and others) will allow such modelling of huge samples of galaxies over a large range in redshift. This project will develop, test and exploit tools to carry out this modelling using large existing data sets in preparation for their use with the new generation of facilities.
Studying invisible galaxy clusters (Maughan)
Forthcoming galaxy surveys will detect tens of thousands of clusters of galaxies in the distant Universe. The amount and properties of plasma in these clusters can be measured by their X-ray emission, and probes the evolution of the clusters and the history of feedback from supernovae and supermassive black holes within their galaxies. The X-ray emission from individual distant clusters is often too faint to be seen, but by developing new statistical and computational techniques we will combine information from many such "invisible" clusters to probe their properties well beyond the limits of what has previously been possible.
The formation and evolution of rocky planets (Leinhardt)
Thousands of extrasolar planets have been detected over the past few decades, with a broad range of physical and orbital parameters. It is, however, still unclear why planet formation is so common and what determines its final outcomes. To investigate these questions, a combination of high-resolution, parallelised N-body and hydrocode simulations will be used to determine the importance of impacts for the orbital and compositional diversity of the extrasolar planet population.
LHCb (Rademacker, Petridis)
Precision flavour physics uses huge data samples to study the asymmetry between matter and antimatter, rare decays of hadrons and new phenomena in weak and strong interactions. This approach is sensitive to exotic virtual particles far heavier than those produced even in the highest-energy collisions; such particles are predicted by many theories addressing questions left open by the Standard Model of particle physics, such as the nature of Dark Matter. The required sensitivity can only be achieved through analysing very large datasets. In this set of projects, new ways of using “big data” technologies are employed to analyse, filter, and simulate the vast amount of data being accumulated at the world’s leading flavour physics experiment, LHCb. These techniques will include modern machine learning methods combined with the use of highly parallelised computing architectures for event filtering, data analysis and Monte Carlo simulations. There are also opportunities for collaboration with the theory community, as well as collaborative approaches across several experiments.
DUNE (Rademacker, Newbold, Brooke)
DUNE is a new, large-scale international project whose key aim is to measure charge-parity asymmetry in neutrinos; its discovery could hold the key to understanding the baryon asymmetry of the universe, one of the biggest open questions in particle physics and cosmology. To perform this measurement, an intense neutrino beam will be fired through the Earth's crust into four 10 kt liquid-argon time projection chambers. This beautiful detector technology produces vast amounts of data and exquisitely precise, high-resolution 3D pictures of the charged-particle tracks resulting from neutrino interactions (as well as backgrounds). DUNE is currently in an R&D/pre-construction stage. There are exciting challenges and opportunities in DUNE data acquisition, event filtering and pattern recognition, as well as in the analysis of DUNE prototype data.
Machine learning for the SHiP experiment (Petridis)
The SHiP (Search for Hidden Particles) experiment is a future experiment at CERN that will search for light, long-lived particles by directing a beam of protons onto a fixed target. One of the challenges of this experiment is the reduction of the large rate (~100 GHz) of background muons produced in the initial collision. A combination of a magnetic sweeping shield and multiple veto systems is required to reduce this background to less than one pair of muons over the lifetime of the experiment. Machine learning techniques will need to be employed both to optimise the design of the magnetic shield and for pattern recognition using a combination of veto, tracking and timing detectors. The SHiP experiment will also search for Light Dark Matter (LDM) particles produced in the initial proton collision, through the elastic scattering of an LDM particle off the electrons and protons of SHiP's emulsion detector. This is a novel approach to searching for Dark Matter and will require machine learning techniques both to classify events against the vast background of neutrino-induced scattering and for regression to measure the energy of the scattered electron.
CMS (Brooke, Flaecher, Goldstein, Newbold)
Following the discovery of the Higgs boson in 2012, the CMS experiment is continuing its programme to precisely measure the parameters of the Standard Model of particle physics and to search (both directly and indirectly) for new physics phenomena. The high-energy data from the LHC could reveal the nature of dark matter, signs of supersymmetry, or evidence for new forces of nature. The volume of data created in LHC collisions is huge, requiring advanced filtering and analysis algorithms to extract the most interesting signals. Projects in this area would apply the latest "big data" techniques to CMS data processing, both online and offline, in order to maximise the sensitivity of CMS to a number of new physics scenarios.
Data Acquisition at Future Colliders (Brooke, Newbold)
Hadron colliders have historically presented a cutting-edge challenge in data acquisition. The high luminosities required, coupled with finely segmented detectors, result in data rates far in excess of what can practically be stored. In the past this challenge has been met using combinations of custom electronics processors with off-the-shelf computing to analyse and select events in real time as they are produced. Future colliders, such as the HL-LHC upgrade or a 100 TeV circular collider (FCC), will produce data at ever higher rates, and present exciting opportunities to explore the use of advanced technologies in real-time processing and data acquisition. These include hardware acceleration techniques, for example the use of FPGAs, GPUs and other co-processors, as well as software-based techniques such as machine learning.
Development of novel dosimeters for radiotherapy (Supervisor: Velthuis)
This PhD project concerns the development of novel dosimeters for conventional radiotherapy and novel treatments. With the advent of Intensity Modulated Radiotherapy (IMRT) it has become possible to deliver treatment with very high precision. High-precision delivery is, however, most valuable when it can be combined with knowledge of the position of the tumour, which implies that the dosimetry needs to be performed with high precision in a high magnetic field. Modern treatment and dosimetry require extensive Monte Carlo simulations to calculate the energy deposited in the patient and in the detectors. This work concerns the interaction of low-energy particles (photons and electrons with energies between 0.1 and 10 MeV) with the detector materials and the patient, and is thus very similar to standard particle- and astro-detector design. Large Monte Carlo programs already exist for conventional IMRT machines, but the aim of this project is to develop the equivalent capability within an MRI machine, where the simulation is more complicated owing to the high magnetic field and the surrounding material. We have the facilities to cross-check the simulations in an MRI-IMRT machine. We also want to study in Monte Carlo how the signal of the outgoing beam can be used to further improve knowledge of the full 3D dose deposition.
The work encompasses a large Monte Carlo simulation program to optimize the designs of prototypes that will be produced in a different PhD project (funding received, start Sept 2017). We aim to produce functional prototypes for both applications. Proposed supervisors are Dr Hugtenburg (Swansea, Bristol) and Dr Velthuis (Bristol, Swansea) and it is foreseen to collaborate with the Netherlands Cancer Institute. The student would be mainly based in Swansea but work closely with the team based in Bristol.
How to Apply
To apply for one of the studentships offered by Bristol, please use the on-line application system and follow the standard procedure for a PhD position in Physics (including all relevant supplementary information - CV, personal statement, transcripts and arranging for at least two academic references to be subsequently provided). Within the "Research Details" section mention "Data Intensive CDT" and whether you are applying for Astrophysics, Particle Physics or both. Applications will be considered as they are received until a deadline of Monday 29th May 2017.
For further details about projects in Astrophysics please contact Professor Malcolm Bremer (email@example.com). For those in Particle Physics, and for any general enquiries please contact Dr Henning Flaecher (firstname.lastname@example.org).
The School of Physics is committed to good employment practices and a supportive working environment for all staff. We continuously review our working practices to ensure that all staff and students are well supported in their work and study. The school holds an Athena SWAN bronze award. The Equality Challenge Unit's Athena SWAN Charter recognises commitment to advancing women's careers in science, technology, engineering, maths and medicine (STEMM) employment in higher education and research. The School of Physics also holds an Institute of Physics Juno Practitioner award. The aim of the Institute of Physics Juno awards is to recognise and reward departments that can demonstrate they have taken action to address the under-representation of women in university physics and to encourage better practice for both women and men.