The Bristol AI Summer School (BAIS) 2025
The Interactive AI and Practice-Oriented AI CDTs at the University of Bristol are holding an in-person summer school from 9 to 11 September 2025. Over three days, a range of experts from industry and academia will discuss the fundamentals of, and the latest progress in, key areas of AI. Please note that day 3 of the event is for our Interactive AI and Practice-Oriented AI CDT students and staff only. This year, the Summer School will focus on AI for health.
The event will take place at Wills Conference Hall in Bristol.
Programme:
Public Programme (subject to change)
Tuesday 9 September 2025
"Solutions for stroke imaging"
"The Journey from Academia towards Deployment"
Abstract:
In this talk, Jeff will share the journey of turning academic research into the spin-out company IngeniumAI. Originally a postdoctoral research project led by Jeff, IngeniumAI is dedicated to improving detection of uncommon but deadly diseases by harnessing underutilised medical imaging data. Jeff will outline the path from initiating the research to forming a company and managing its day-to-day operations. The talk will explore key challenges in developing and deploying Software as a Medical Device (SaMD) and AI as a Medical Device (AIaMD) products, including funding, regulatory, and market access considerations.
Bio:
Dr Jeff Clark is co-founder and CEO of IngeniumAI, a spin-out company from the University of Bath and Royal United Hospitals Bath NHS Trust. Jeff has a background in medical engineering, with a PhD in 4D imaging for tissue regeneration from Imperial College London; before that, he developed HIV diagnostics for remote regions at Cambridge and portable anaesthetic delivery systems at Cardiff. Alongside his role at IngeniumAI, Jeff is a part-time Research Fellow at the University of Bristol, where he applies machine learning to tackle pressing challenges for social good. In his spare time Jeff runs TheSavvyScientist.com, a resource aiming to make PhDs more accessible.
Wednesday 10 September 2025
"Medical AI Safety"
"Beyond XAI: Explainable Data-driven Modelling for Human Reasoning and Decision Support"
Abstract:
Insights from the social sciences have transformed explainable artificial intelligence from a largely technical into a more human-centred discipline, enabling diverse stakeholders, rather than technical experts alone, to benefit from its developments. The focus of explainability research itself has nonetheless remained largely unchanged: to help people understand the operation and output of predictive models. This, however, may not be the most consequential function of such systems; they can be adapted to complement, augment and enhance the abilities of humans in an explainable way, rather than (fully) automating their various roles. In this talk Kacper will explore how we can reimagine XAI by drawing upon a broad range of relevant interdisciplinary findings. The resulting, more comprehensive conceptualisation of the entire research field promises to be better aligned with humans by supporting their reasoning and decision-making in a data-driven way. As the talk will show, medical applications, as well as other high-stakes domains, stand to benefit greatly from such a shift in perspective.
Bio:
Kacper is a researcher in the Faculty of Informatics at USI in Lugano, Switzerland. His main research focus is the transparency – interpretability and explainability – of data-driven predictive systems based on artificial intelligence and machine learning algorithms intended for medical applications. Prior to joining USI, he worked with the Medical Data Science group at ETH Zurich. Before that, he was a Research Fellow at the ARC Centre of Excellence for Automated Decision-Making and Society, affiliated with RMIT University in Melbourne, Australia. Prior to that, he held numerous research positions at the University of Bristol, United Kingdom, working on diverse AI and ML projects. Kacper holds a Master's degree in Mathematics and Computer Science and a doctorate in Computer Science from the University of Bristol.
"Vision-Language Models for Chest X-Rays"
Abstract:
The chest X-ray (CXR) is a widely requested imaging test used as a quick, non-invasive procedure to examine various pathologies in the chest cavity. To interpret these scans, radiologists integrate visual findings with multiple sources of patient information, such as medical history, and document the relevant findings visualised in the CXR in free-text radiology reports. However, an ever-increasing volume of imaging studies, coupled with a global shortage of radiologists, has created significant diagnostic backlogs that can delay treatment and negatively impact patient outcomes. This talk explores the potential of Vision-Language Models (VLMs) to help address this challenge. I will present different approaches to enhancing VLM performance for key radiological applications, focusing specifically on automating the generation of accurate radiology reports and answering clinically relevant questions from CXRs.
Thursday 11 September 2025 - Student-centred day for IAI and PrO-AI CDT students
"Presenting with Confidence and Flair"
For many of us, presenting to an audience of peers and familiar academics is gruelling enough; speaking at a conference is far worse. The sense of exposure; the inner voice that seems to offer nothing but criticism; the prospect of being asked tough questions – all these can be paralysing.
This workshop equips research students with strategies to manage their nerves, offering ways to hold an audience’s attention, convey information and ideas clearly, and respond to questions with authority and composure. Participants will have the opportunity to analyse their presentation style and explore ways of developing it.
"Trusted Research Workshop"
Trusted Research is becoming increasingly important as government oversight of university-generated research grows. Recent geopolitical developments have heightened concerns around national security, particularly in relation to sensitive technologies and software developed within universities. Conducting due diligence on who we collaborate with is a key priority, especially when it comes to ensuring the integrity of international research collaborations. This workshop will highlight the main risk indicators related to Export Control and other national security legislation, and explore how these considerations may apply to projects within the PrO-AI CDT.
"An Introduction to Responsible Research and Innovation"
This interactive session invites participants to explore themes of responsibility, the future impacts of research, and researchers' personal standpoints through playful activities. The session will also introduce the AREA framework for RRI and provide an overview of the RRI work to date in the PrO-AI CDT.