An algorithm that estimates someone’s pain levels by looking at their face could help stop doctors prescribing painkillers to people who don’t need them
Putting on a brave face won’t fool this algorithm. A new system that rates how much pain someone is in just by looking at their face could help doctors decide how to treat patients. By examining tiny facial expressions and calibrating the system to each person, it provides a level of objectivity in an area where that’s normally hard to come by.
“These metrics might be useful in determining real pain from faked pain,” says Jeffrey Cohn at the University of Pittsburgh in the US. The system could make the difference between prescribing potentially addictive painkillers and catching out a faker.
Objectively measuring pain levels is a tricky task, says Dianbo Liu, who created the system with his colleagues at the Massachusetts Institute of Technology. People experience and express pain differently, so a doctor’s estimate of a patient’s pain can often differ from a self-reported pain score.
In an attempt to introduce some objectivity, Liu and his team trained an algorithm on videos of people wincing and grimacing in pain. Each video showed a person with shoulder pain who had been asked to perform a movement and then rate their pain level. The result was an algorithm that can use subtle differences in facial expressions to inform a guess about how much pain a given person is in.
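The underlying idea can be sketched as a simple regression: map per-video facial-movement features to the self-reported pain score. Everything below is invented for illustration (the feature set, the synthetic data and the plain least-squares model are assumptions, not the team's actual method):

```python
# Hypothetical sketch: regress self-reported pain scores from facial-movement
# features. The four features (mean movement around brow, eyes, nose, mouth)
# and all data are invented; the real system's model is not described here.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training set: one row per video clip.
X = rng.random((200, 4))                   # [brow, eyes, nose, mouth] movement
true_w = np.array([0.5, 0.8, 2.0, 2.5])    # nose/mouth weighted most, per the article
y = X @ true_w + rng.normal(0, 0.1, 200)   # self-reported pain score

# Ordinary least-squares fit with an intercept column.
w, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], y, rcond=None)

def estimate_pain(features):
    """Estimate a pain score from a 4-vector of facial-movement features."""
    return float(np.r_[features, 1.0] @ w)
```

On this toy data the fitted weights around the nose and mouth come out larger than those around the brow and eyes, echoing the pattern Liu describes below.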
Certain parts of the face are particularly revealing, says Liu. Large amounts of movement around the nose and mouth tended to suggest higher self-reported pain scores.
There’s evidence that even less sophisticated pain-recognition algorithms are less easily fooled than their human counterparts. A study from the University of California, San Diego, found that a computer system could weed out fakers 85 per cent of the time, whereas trained humans were only accurate 55 per cent of the time.
To make it more accurate, Liu's system can be tweaked to take into account someone's age, sex and skin complexion. An individual's age had the most impact on how they expressed pain, and Liu found that his personalised approach was better at estimating pain than one-size-fits-all systems.
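One simple way to realise that personalisation is to feed the demographic attributes into the model alongside the movement features, so the estimate is calibrated to the individual. The sketch below is an assumption about how this could work, with invented data and encodings (including a made-up age effect on expression):

```python
# Hypothetical personalisation sketch: append age, sex and skin-complexion
# features so a linear model can correct its pain estimate per person.
# All data, encodings and the age effect are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
n = 300

movement = rng.random((n, 2))          # e.g. nose and mouth movement magnitudes
age = rng.integers(20, 80, n)          # years
sex = rng.integers(0, 2, n)            # 0/1 encoding (assumed)
tone = rng.integers(0, 6, n)           # skin-complexion scale (assumed)

# Invented ground truth: older subjects show less movement for the same pain,
# so a movement-only model underestimates their scores.
y = movement @ np.array([2.0, 2.5]) + 0.02 * (age - 50)

def fit_rmse(X, y):
    """Least-squares fit with intercept; return training RMSE."""
    A = np.c_[X, np.ones(len(X))]
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return float(np.sqrt(np.mean((y - A @ w) ** 2)))

rmse_generic = fit_rmse(movement, y)                       # one-size-fits-all
rmse_personal = fit_rmse(np.c_[movement, age, sex, tone], y)  # personalised
```

On this toy data the personalised model fits the age-dependent expression almost exactly, while the movement-only model is left with an age-shaped error, mirroring the advantage Liu reports.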
Cohn is impressed with the results and says it's the first time he has seen a pain-recognition algorithm that can be tweaked to give personalised results based on age, sex and skin complexion. It's still early days, but Liu says there's nothing stopping the system from eventually being made into an app that doctors could have on their smartphones.
Liu says the system could never be a replacement for real doctors. The videos his algorithm was trained on were taken in ideal lighting and photography conditions, so it's unlikely the system would be as accurate if it were used on real patients. Still, he's planning to further train the algorithm with more videos of people in pain to see if that boosts its pain-rating abilities.