Oliver Deane

General Profile:

Having graduated from the University of York with a BSc in Psychology, I went on to obtain an MSc in Computational Neuroscience and Cognitive Robotics from the University of Birmingham. My BSc dissertation focused on memory consolidation, while my MSc research aimed to automate the annotation of eye-tracking data using computer vision architectures such as Mask R-CNN.

Since then, I have undertaken various volunteer projects exploring how AI can best be implemented as a force for social good. This work has ranged from NLP solutions that augment charitable operations to the effective deployment of mental health chatbots.

I look forward to combining these interests throughout my PhD and am currently investigating research avenues that apply NLP methodologies to explore the propagation of misinformation across online social environments.

Research Project Summary:

Individuals with Autism Spectrum Disorder (IwASD) commonly experience impairments in social interaction, often manifesting as difficulty communicating in a manner that accords with normative expectations. The literature frequently attributes such deficits to an underdeveloped Theory of Mind, arguing that IwASD's social engagement is hindered by a limited understanding of others' perspectives and intentions. In this research, we contribute a novel approach to training these core perspective-taking capabilities by combining two emerging sub-fields of autism research: 1) Relational Frame Theory (RFT), a theory of language and cognition increasingly used as a theoretical basis for modern autism interventions, and 2) robot-enhanced therapy, defined as the use of robotic assistive tools to strengthen the social interactions of IwASD. Accordingly, we present a prototypical semi-autonomous robotic system specialized for enhancing RFT-based training sessions. The system is built upon the NAO humanoid robot and incorporates a series of sensory components that maximize autonomy, together with an interactive component that permits operator oversight. The system was then deployed during an RFT training session conducted by a trained therapist with a neurotypical 3-year-old child, with the aims of 1) evaluating the current performance of the presented system, and 2) identifying future functionality based on observations of therapist and child behaviors. Regarding the first aim, a key finding is that integrating a sound localization module into the robot's auditory processing would enhance its autonomy and reduce reliance on the on-hand human operator. Further, augmenting its vision capability by training the underlying neural network to handle more extreme brightness variation would similarly bolster the robot's robustness during real clinical sessions. The observation analysis then provides additional insights for relevant future research.
Notable lessons highlight the potential value of a robotic system that can alleviate therapist workload in an RFT context, and recommend an engagement-detection sub-system that would monitor and respond to a child's variable attention levels. The paper concludes with suggestions for how future work can act upon these insights and ensure continued progress toward an autonomous robotic system specialized for RFT-based autism interventions.




