TSL Seminar Series, 2025-2026 Academic Year

This page contains information on Seminars that occurred during the 2025-2026 Academic Year. Details of upcoming talks, and how to attend, can be found on the main page: Seminar Series

1st October 2025, Model Based UAV Test Generation

We introduce the overall winner of the Uncrewed Aerial Vehicle (UAV) Testing Competition at the 18th IEEE International Conference on Software Testing, Verification and Validation (ICST) 2025 and the 18th International Workshop on Search-Based and Fuzz Testing (SBFT). We then present an extension that combines genetic algorithms with a low-fidelity UAV path simulator to efficiently produce effective UAV test cases.
We also propose new metrics to measure UAV testing coverage and support prioritisation of test case selection. These metrics provide insights into both the diversity and situational relevance of the generated test cases.
Simulation-based testing provides a safe and cost-effective environment for verifying the safety of UAVs. However, identifying effective test suites requires a large number of simulations, which is resource-intensive. To address this challenge, we optimise simulation resources using a model-based test generator that efficiently produces effective and diverse test suites. A genetic algorithm further enhances test generation by employing a Neural Network (NN) as a surrogate fitness function, enabling rapid evaluation of test cases. For the NN to make accurate predictions, it must be trained on a large dataset—one that cannot be feasibly generated using computationally intensive High-Fidelity Simulators (HFS). To overcome this, we simplify the PX4 autopilot HFS to develop a Low-Fidelity Simulator (LFS), which can produce the required training data an order of magnitude faster than the HFS.
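The surrogate-assisted search described above can be sketched in miniature. This is an illustrative sketch only, not the speakers' actual tool: a candidate test case is modelled as a list of waypoint coordinates, and a simple stand-in function plays the role that a trained neural network would play in practice, scoring candidates without invoking a high-fidelity simulator. All names and parameters here are hypothetical.

```python
import random

def surrogate_fitness(test_case):
    # Stand-in for the trained NN surrogate: scores a candidate cheaply.
    # Here it rewards paths with large waypoint-to-waypoint changes,
    # a crude proxy for "challenging" UAV trajectories.
    return sum(abs(test_case[i + 1] - test_case[i])
               for i in range(len(test_case) - 1))

def mutate(tc, rate=0.2):
    # Randomly perturb some waypoints.
    return [w + random.uniform(-1, 1) if random.random() < rate else w
            for w in tc]

def crossover(a, b):
    # Single-point crossover between two parent test cases.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def evolve(pop_size=20, genome_len=8, generations=30):
    # Random initial population of candidate test cases.
    pop = [[random.uniform(0, 10) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Rank by the cheap surrogate instead of running a simulator.
        pop.sort(key=surrogate_fitness, reverse=True)
        elite = pop[: pop_size // 2]  # keep the best half
        children = [mutate(crossover(random.choice(elite),
                                     random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return max(pop, key=surrogate_fitness)

best = evolve()
```

In the approach the abstract describes, only the top-ranked candidates would then be run through the high-fidelity simulator, so the expensive simulations are spent on the most promising test cases.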

Dr Anas Shrinah

A photograph of Dr Anas Shrinah

Dr Anas Shrinah is an Assistant Professor at the Applied Science University in Amman, Jordan, and an Honorary Senior Research Associate with the Trustworthy Systems Laboratory at the University of Bristol. His research focuses on leveraging artificial intelligence to generate effective test cases for cyber-physical systems. Anas led the development of the UAV test case generation tool that won first place in the SBFT 2025 UAV Testing Competition and was named the overall winner in both the SBFT 2025 and ICST 2025 UAV Testing Competitions.
Anas is a certified Project Management Professional (PMP) with over 16 years of combined experience in academia and industry. He holds a PhD in the verification and validation of planning-based autonomous systems, an MSc in Robotics (with Distinction), as well as a first-class honours BEng in Computer and Automation Engineering.
Dr Chris Bennett

A photograph of Dr Chris Bennett
Dr Chris Bennett is a Senior Research Associate with the Trustworthy Systems Laboratory at the University of Bristol, developing machine learning techniques for test-based verification. A chartered engineer with a background in automotive systems engineering, he worked at Jaguar Land Rover before transitioning into research seven years ago, completing a PhD in Robotics and Autonomous Systems. He has previously worked on projects with Thales UK, examining the role of hybrid autonomy in multi-agent systems, and on the UKRI-funded Trustworthy Autonomous Systems project, investigating how trust can be built in artificial intelligence and robotics through systems engineering practices. His research interests include test-based verification, systems engineering design practices, and multi-agent artificial intelligence.

8th October 2025, Systems Trustworthiness for Human Rights

Trustworthy systems need to consider factors such as privacy-by-design, safety-by-design, and security-by-design. These all form part of upholding and respecting human rights, but alone they are not enough. This session will discuss some of the often-overlooked considerations when designing for real-world deployment, and what is happening in the compliance world to standardise these approaches. This session will be of use to anyone designing and deploying technology systems, even (and especially) if they are unfamiliar with their obligations to respect human rights.

Beckett LeClair

Beckett is the Head of Compliance at 5Rights Foundation, an NGO working internationally to uphold the rights of young citizens as they interface with the digital world. He is involved in standards development in multiple jurisdictions, especially AI standards at the European level, and has a particular interest in ensuring technology respects the freedoms of vulnerable and/or marginalised citizens. Beckett was previously a Senior Engineer at Frazer-Nash as part of the Digital Assurance team, with a focus on cyber security and responsible AI.