Lifting the lid on Isambard-AI, the UK’s fastest AI supercomputer

Isambard-AI hardware.

10 June 2025

As the doors open at ISC 2025 in Hamburg, we’re incredibly proud to share what’s been quietly taking shape in Bristol, UK. Introducing Isambard-AI, the UK’s fastest AI supercomputer and one of Europe’s most advanced AI-focused systems.

Isambard-AI puts incredible amounts of computational power in the hands of brilliant humans who have research projects or industry innovations that, when accelerated, will be world-changing.

Built by the Bristol Centre for Supercomputing at the University of Bristol, Isambard-AI represents a leap forward in our national capability to train large AI models, run complex scientific simulations, and collaborate across disciplines. It officially launches later this summer, and today we’re lifting the lid on specs, suppliers and usability.

An unprecedented pace

Unlike traditional supercomputing builds, which often take 3 to 5 years from funding to deployment, Isambard-AI was built in under 18 months. This feat was made possible through:

  • Early supplier involvement in the design phase
  • Project streams running concurrently
  • A modular, containerised datacentre design
  • Pre-integrated hardware and fast-tracked system testing
  • Close collaboration across University of Bristol teams, contractors, and vendors

When the UK Government’s call for AI compute became urgent, we knew we had to move fast without compromising on capability or sustainability. Thanks to an extraordinary team effort, we delivered just that.

Powering purposeful projects

Isambard-AI phase 2 will reach full capacity this summer. However, the rapid delivery of Isambard-AI phase 1 in summer 2024 has already enabled researchers to carry out a wide range of innovative AI research. Giving vanguard users early access has allowed us to maximise early impact: since the start of 2025, almost 60 research projects have used around 100,000 GPU hours on Isambard-AI phase 1.

The breadth of research supported by Isambard-AI is wide and rapidly expanding, with projects from a range of research sectors already making use of the platform. Examples include computational chemistry, medical imaging, and the arts, humanities and culture, as well as fine-tuning of Large Language Models and AI safety research.

We celebrated these early achievements at the first Isambard Day conference in March 2025, bringing together over 100 researchers, technical professionals and policymakers.

Powerful and efficient hardware

The computing power of Isambard-AI, designed and delivered by HPE, is based on the NVIDIA GH200 Grace Hopper Superchip, with a total of 5448 available across phases 1 and 2. Each GH200 consists of a 72-core NVIDIA Grace CPU with 120 GB of fast LPDDR5 memory and an H100 GPU with 96 GB of HBM3 memory. Four GH200 Superchips are combined in each of Isambard-AI’s 1362 compute nodes, giving each node a huge 864 GB of memory in a single address space. Within each node, the NVIDIA NVLink-C2C interconnect provides a total of 800 Gbps of bandwidth. Four HPE Slingshot 11 interconnects, each running at 200 Gbps, provide fast communication across the system. Storage on Isambard-AI is based on ultra-fast solid-state technology, combining 20 PiByte Cray ClusterStor and 3.5 PiByte VAST solutions.
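
For a rough sense of how those figures combine, here is a back-of-envelope sketch in Python using only the numbers quoted above (the variable names are ours, purely for illustration):

    # Back-of-envelope check of the per-node figures quoted above.
    GRACE_LPDDR5_GB = 120    # CPU memory per GH200 Superchip
    HOPPER_HBM3_GB = 96      # GPU memory per GH200 Superchip
    SUPERCHIPS_PER_NODE = 4

    node_memory_gb = SUPERCHIPS_PER_NODE * (GRACE_LPDDR5_GB + HOPPER_HBM3_GB)
    print(f"Memory per compute node: {node_memory_gb} GB")   # 864 GB

    # System-wide totals across phases 1 and 2.
    TOTAL_SUPERCHIPS = 5448
    total_nodes = TOTAL_SUPERCHIPS // SUPERCHIPS_PER_NODE
    print(f"Compute nodes: {total_nodes}")                   # 1362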

Isambard-AI phase 2 is the most powerful supercomputer in the UK. Together, its 5280 GH200s can deliver over 200 PetaFLOP/s of 64-bit floating point performance for traditional HPC workloads, rising to over 21 ExaFLOP/s at 8-bit precision for cutting-edge AI workloads.
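
Dividing those headline totals by the 5280 Superchips gives an indicative per-Superchip view of the jump from 64-bit to 8-bit throughput; this is our own illustrative arithmetic rather than a vendor-quoted per-chip specification:

    # Rough per-Superchip performance implied by the phase 2 totals above.
    PHASE2_SUPERCHIPS = 5280
    FP64_TOTAL_PFLOPS = 200        # >200 PetaFLOP/s at 64-bit
    FP8_TOTAL_PFLOPS = 21_000      # >21 ExaFLOP/s at 8-bit

    fp64_per_chip_tflops = FP64_TOTAL_PFLOPS * 1000 / PHASE2_SUPERCHIPS
    fp8_per_chip_pflops = FP8_TOTAL_PFLOPS / PHASE2_SUPERCHIPS

    print(f"~{fp64_per_chip_tflops:.0f} TFLOP/s FP64 per Superchip")   # ~38
    print(f"~{fp8_per_chip_pflops:.1f} PFLOP/s FP8 per Superchip")     # ~4.0
    print(f"FP8 vs FP64 throughput: ~{FP8_TOTAL_PFLOPS / FP64_TOTAL_PFLOPS:.0f}x")  # ~105x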

The GH200 Grace Hopper Superchip also delivers leadership energy efficiency. Isambard-AI phase 1 entered the May 2024 Green500 list at #2, making it one of the most energy-efficient supercomputers ever built. Isambard-AI uses Direct Liquid Cooling, which provides efficient thermal management of the 5 MW facility. This technology enables high power density, reducing the system’s physical footprint, and opens the pathway to efficient heat recovery for use in industrial processes or district heating schemes.

User experience matters

We deliver an integrated, AI-focused user experience, combining support, training and outreach activities so that decisions on technical enhancements to the service are driven by users. The aim is to make the Isambard-AI experience as inclusive and equitable as possible.

Isambard-AI’s software stack takes an innovative approach, designed to be sustainable and democratised. Its users will be drawn from the widest range of research sectors, with differing needs and levels of AI and HPC experience.

As such, we are building an ecosystem that offers flexibility and minimises barriers to entry. At its heart, we are championing the concept of self-service, which allows users to port software stacks they already have. Support for containers, Conda and pip provides flexible options for orchestrating machine learning frameworks, and Spack enables installation of many traditional HPC applications to facilitate hybrid simulation-AI use cases.
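
As a minimal sketch of what this self-service model can look like in practice, assume a user has installed PyTorch into their own Conda or pip environment; the snippet below is illustrative, not Isambard-AI-specific documentation:

    # Sanity-check that a user-managed PyTorch environment can see the
    # node's NVIDIA GPUs before launching a training job.
    import torch

    if torch.cuda.is_available():
        for i in range(torch.cuda.device_count()):
            props = torch.cuda.get_device_properties(i)
            print(f"GPU {i}: {props.name}, {props.total_memory / 2**30:.0f} GiB")
    else:
        print("No CUDA devices visible - check your environment and job allocation.")

Containerised or Spack-built stacks would run their own equivalent checks; the point is that the environment is assembled and controlled by the user rather than imposed by the system.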

We are also building our capabilities and certifications around security and secure data, so that Trusted Research Environments can unlock the opportunities Isambard-AI presents for user groups with requirements in this area.

Isambard-AI in practice: BritLLM

Words provided by Pontus Stenetorp, Professor of NLP at University College London

Large language models (LLMs) have become a key component of any advanced natural language processing (NLP) system. Recently, we have seen an explosion in public and commercial interest in applying these technologies in academia, government, and industry. This technology promises to fundamentally change how humans interact with computers and enable large-scale automation of any task which involves the generation and processing of text or speech.

To this end, we launched the BritLLM project, an ongoing effort to produce training data, evaluation data, know-how, and open models aligned with UK interests. Our goal is to release competitive, open-source UK LLMs, while empirically quantifying the suitability of commercial and non-commercial models made available by other parties.

Pontus Stenetorp, Professor of NLP at UCL, said, ‘Isambard-AI is the most exciting development for natural language processing research, and LLM research in particular, in the UK to date. We have already been able to produce state-of-the-art, multilingual LLMs with the help of Isambard-AI, and over the next few years it is our hope that, with access to these resources, we will be able to answer fundamental questions about how LLMs operate.

For example, to what degree do LLMs generalise from, as opposed to memorise, their training data? This is a key question for understanding how widespread the adoption and impact of current AI approaches will be. Furthermore, how do we better encourage information to flow between languages so that LLMs can become more multilingual and equitable? These are ambitious, difficult questions, but ones that, with the launch of Isambard-AI, we can now hope to answer within this decade.’

A huge collaborative effort

A project like this doesn’t happen without exceptional collaboration. Thank you to the teams who helped us make Isambard-AI a reality:

  • DSIT, UKRI & STFC – for funding and long-term support
  • HPE – for designing and building Isambard-AI with us
  • NVIDIA – for developing Isambard-AI's cutting-edge GPUs
  • Arm – for designing Isambard-AI's energy efficient CPU cores
  • Contour Advanced Systems – for designing a modular, sustainable, future-proof datacentre that could be built off-site and delivered rapidly
  • VAST – for our next-generation data storage
  • Vertiv and Danfoss – for the high-efficiency liquid cooling solution
  • Oakland Construction – for the extremely rapid and high-quality site build.

In the hands of brilliant humans

We believe Isambard-AI will help answer some of the biggest questions of our time. If you're interested in using the system, working with us, or exploring partnership opportunities, we’d love to hear from you.

This is just the beginning.

 
