MorphFace Robots With Realistic Pain Expressions Developed to Reduce Errors, Bias by Doctors

Researchers in the United Kingdom have developed an approach that produces realistic and precise facial expressions of pain on robotic patients used in medical training for the physical examination of painful areas.

A report from The Tribune noted that this new method, developed by Imperial College London researchers, could help reduce errors and bias by doctors during physical examinations.

The study's findings suggest the approach could help trainee doctors use cues in patients' facial expressions to minimize the force needed during physical examinations, and could also help detect and correct early signs of bias in medical students by exposing them to a wider range of patient identities.

According to Sibylle Rerolle of Imperial's Dyson School of Design Engineering, improving the accuracy of pain-related facial expressions on these robots is a "key step" in improving the quality of physical examination training for medical students.

(Photo: Pexels/Laura Musikanski)


'MorphFace'

In the research, undergraduate students were asked to carry out a physical examination on the robotic patient's abdomen, a related International Business Times report said.

Data on the force applied to the abdomen was used to trigger changes in six different areas of the robotic face, called the "MorphFace," to reproduce pain-related facial expressions.

This approach revealed the order in which the various regions of a robotic face, known as facial action units (AUs), must be triggered to generate the most accurate expression of pain. The study also determined the most appropriate speed and magnitude of AU activation.

The researchers found that the most realistic expressions occurred when the upper-face AUs around the eyes were activated first, followed by the lower-face AUs around the mouth.

Specifically, a longer delay before activation of the Jaw Drop AU produced the most natural results. When doctors perform a physical examination of painful areas, feedback from the patient's facial expression is essential.

Nonetheless, many current medical training simulators cannot display real-time expressions of pain and include only a limited range of patient identities in terms of gender and ethnicity.

An Approach for Better Medical Practice

As specified in a Free Press Journal report, the researchers explained these limitations could lead medical students to develop biased practices, with research already highlighting bias in the ability to recognize facial expressions of pain.

Thilina Lalotharatne of the Dyson School of Design Engineering, the study's co-author, said underlying biases could cause doctors to misinterpret patients' discomfort, increasing the risk of mistreatment, adversely affecting doctor-patient trust, and even causing death.

In the future, the co-author added, a robot-assisted approach could be used to train medical students to normalize their perceptions of pain expressed by patients of different genders and ethnicities.

Related information about a humanoid robot helping reduce pain is shown on IHDCYH Talks' YouTube video below:

Check out more news and information on Robotics in Science Times.
