Department of Engineering

Introducing RoboPatient – a soft robotics approach to training doctors

Cambridge researchers are helping to design healthcare robotics of the future by investigating a robot-assisted approach to training doctors in medical examinations.

One of the aims of the EPSRC-funded collaborative project, known as ‘RoboPatient’, is to gain a deeper understanding of how recent advances in soft robotics and tactile sensing can be used to teach trainee doctors the often complex skill of palpation. Palpation is a method used by doctors during a physical examination, in which they use their fingers or hands to assess the size, texture, location and other characteristics of an organ or body part. It is a technique that is difficult to teach and one that requires a considerable amount of hands-on practice with real patients.

Post-Doctoral Research Associate Luca Scimeca, from the Department of Engineering’s Bio-Inspired Robotics Lab, is working alongside researchers at Imperial College London and the University of Oxford. He is applying his background in artificial intelligence to robotic systems, using machine learning to optimise the tactile sensing capabilities of a robot carrying out palpation. As the robot probes a soft phantom organ (made of silicone rubber), it detects and classifies hard inclusions within it. The results of the study are published in the journal Autonomous Robots.

Using a pressure sensor on the end of a robotic arm, Luca and the research team were able to retrieve tactile images during the probing experiments, a process that translates the sense of touch into a digital image. An Intelligent Hub Board collects the tactile sensor data, and machine learning techniques are then used to detect and classify hard inclusions in the phantom from these images. The aim of this data-driven approach is to help reinforce the learning and training process of medical palpation, leading to increased efficiency and performance.
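To make the tactile-imaging step concrete, the sketch below (not the project’s actual code) shows how readings from a grid-based pressure sensor might be assembled into a normalised 2-D “tactile image”. The 8×8 taxel layout, the raw value range and the simulated lump are all assumptions made for illustration.

```python
# A minimal sketch (not the authors' code) of turning flat pressure-sensor
# readings into a 2-D "tactile image" suitable for image-style processing.
import numpy as np

SENSOR_ROWS, SENSOR_COLS = 8, 8      # assumed taxel layout of the pressure sensor
PRESSURE_MAX = 4095.0                # assumed raw range of each taxel reading

def to_tactile_image(raw_readings):
    """Reshape a flat list of taxel readings into a normalised 2-D tactile image."""
    frame = np.asarray(raw_readings, dtype=float).reshape(SENSOR_ROWS, SENSOR_COLS)
    return frame / PRESSURE_MAX      # scale to [0, 1] so frames are comparable

# Example: a simulated frame in which a hard inclusion produces a localised
# high-pressure region near the centre of the sensor pad.
rng = np.random.default_rng(0)
flat = rng.uniform(0, 500, SENSOR_ROWS * SENSOR_COLS)
flat[27] = flat[28] = flat[35] = flat[36] = 3500   # pretend lump under these taxels
image = to_tactile_image(flat)
print(image.round(2))
```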

Whereas previous research has paid relatively little attention to the physical palpation techniques a robot employs during the examination, this study investigates the effects of various motion strategies on the response of the pressure sensor. The researchers use vertical and rotatory probing strategies to better align with the skill of manual palpation, which typically involves a complex mix of methods, including varying the amount of pressure applied, using different finger positions and adopting various tapping techniques.
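As a rough illustration of what such motion strategies can look like in software, the following sketch parameterises a purely vertical press and a combined press-and-twist (“rotatory”) probe as simple waypoint trajectories. The depths, angles and step counts are invented for the example and are not values from the study.

```python
# A simplified sketch, under assumed parameters, of two probing strategies:
# a straight vertical press and a rotatory probe that also twists about the
# vertical axis while indenting.
import numpy as np

def vertical_press(depth_mm=5.0, steps=50):
    """Return (z, theta) waypoints for a purely vertical indentation."""
    z = np.linspace(0.0, -depth_mm, steps)       # downward displacement
    theta = np.zeros(steps)                      # no rotation
    return np.column_stack([z, theta])

def rotatory_probe(depth_mm=5.0, max_angle_deg=30.0, steps=50):
    """Return (z, theta) waypoints that combine indentation with a twist."""
    z = np.linspace(0.0, -depth_mm, steps)
    theta = np.linspace(0.0, np.radians(max_angle_deg), steps)
    return np.column_stack([z, theta])

strategies = {"vertical": vertical_press(), "rotatory": rotatory_probe()}
for name, waypoints in strategies.items():
    print(name, waypoints.shape)                 # each strategy is a 50x2 trajectory
```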

With the assistance of a support-vector machine (a supervised learning model trained on a labelled dataset), the research team demonstrate that the proposed framework can predict which motion strategy produces the most discriminative palpation actions.
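The snippet below is a hedged, self-contained example of this kind of classification step using scikit-learn’s SVC on synthetic stand-in data; the feature size, class labels and “lump” signature are assumptions, not the study’s dataset.

```python
# Illustrative only: a support-vector machine trained on labelled tactile
# frames to separate "inclusion" from "no inclusion" probes.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n_samples, n_taxels = 200, 64                      # assumed: 8x8 sensor flattened to 64 features

# Synthetic tactile frames: inclusions add a localised pressure bump.
X = rng.normal(0.0, 1.0, (n_samples, n_taxels))
y = rng.integers(0, 2, n_samples)                  # 1 = hard inclusion present
X[y == 1, 27:29] += 3.0                            # fake "lump" signature on two taxels

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```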

Luca said: “The results of our study present a framework that we feel can be most effective when used as a pre-learning step, before any actual supervised learning takes place. We have shown how assessing the robot’s palpation techniques may aid in finding those motions which can best diagnose hard inclusions in soft tissue – this is without the need for explicit labels or knowledge of whether the tissue under palpation has abnormal lumps. Our approach enables the robot to make an informed choice when it comes to determining what type of touch-enabled interaction is likely to be the most discriminative.”
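One way to picture this label-free “pre-learning” step is to score each candidate motion by how well its tactile data separate into clusters, without using any ground-truth labels. The sketch below uses k-means clustering and a silhouette score purely as one plausible separability measure; it illustrates the idea rather than the specific metric reported in the paper.

```python
# A sketch of ranking palpation motions by unsupervised separability of the
# tactile data they produce, with no labels about whether a lump is present.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def discriminability(tactile_frames, n_clusters=2):
    """Cluster the frames and return an unsupervised separability score."""
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(tactile_frames)
    return silhouette_score(tactile_frames, labels)

rng = np.random.default_rng(2)
# Synthetic data: pretend the rotatory motion produces more separable frames.
frames_by_motion = {
    "vertical": rng.normal(0.0, 1.0, (100, 64)),
    "rotatory": np.vstack([rng.normal(-2.0, 1.0, (50, 64)),
                           rng.normal(+2.0, 1.0, (50, 64))]),
}
scores = {m: discriminability(f) for m, f in frames_by_motion.items()}
best = max(scores, key=scores.get)
print(scores, "-> most discriminative motion:", best)
```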

The next steps of the RoboPatient project will focus on the patient side of the palpation process. In collaboration with Imperial College London, Luca and the research team will collect data on how palpation is experienced by the patient in order to create a more realistic scenario, capturing reactions such as pain, facial expressions and even verbal cues. Imperial College London's latest soft phantom organ is integrated with an array of sensors that can measure, for example, the pressure being applied. This, together with a modular robotic face designed to present pain expressions, can help guide the palpation procedure and influence how the doctor performs it.

Find out more about RoboPatient.

Reference:
Scimeca, L., Maiolino, P., Bray, E. and Iida, F. ‘Structuring of tactile sensory information for category formation in robotics palpation’. Autonomous Robots (2020). DOI: 10.1007/s10514-020-09931-y

Research Associates Luca Scimeca and Dr Simon Hauser contribute to a virtual workshop on RoboPatient. Credit: Morphlab ICL, YouTube.
