CEM Research Seminar 1 - Adaptable Robots, Ethics, and Trust: A qualitative and philosophical exploration of Trustworthy AI

Please join us for the first event in our 2023/24 Centre for Ethics in Medicine (CEM) Research Seminar series with our very own Stephanie Sheir, Research Associate in Engineering Ethics.

Abstract

Much has been written about the need for trustworthy artificial intelligence (AI), but the underlying meanings of trust and trustworthiness vary and are often used in confusing ways. It is not always clear whether individuals are speaking of a technology's trustworthiness, a developer's trustworthiness, or of gaining users' trust through other means. In sociotechnical circles, trustworthiness is often used as a proxy for 'the good', illustrating the moral heights to which technologies and developers ought to aspire, sometimes accompanied by a multitude of diverse requirements and at other times by no specification at all. In philosophical circles, there is doubt that the concept of trust should be applied to technologies at all, rather than to their human creators. Nevertheless, people continue to reason intuitively about trust in their everyday language. This qualitative study employed an empirical ethics methodology, using a series of interviews to address a) how developers theorise and operationalise trust and trustworthiness during the development and deployment of AI, and b) how and why users place trust in AI. It was found that different accounts of trust (rational, affective, credentialist, norms-based, relational) served as the basis for individuals' granting of trust in technologies and operators. Ultimately, the most significant requirement for user trust and assessments of trustworthiness was the accountability of AI developers for the outputs of AI systems, hinging on the identification of accountable moral agents and perceived alignment between the user's and the developer's interests.

Contact information

To book your place, please complete your details on our booking form.

If you have any questions, please email Dani O'Connor.