Automatic Detection of Nociceptive Stimuli and Pain Intensity from Facial Expressions

I presented a poster at the 2017 Annual Meeting of the American Pain Society in Pittsburgh, describing collaborative work between the MIT Affective Computing group and MedImmune on automatic pain detection using computer vision and machine learning.

The project explored how deep neural networks can detect pain onset, offset, and intensity directly from facial expressions during controlled nociceptive stimulation. The goal was to develop tools that complement patient self-report and observer-based assessments, both of which are subjective, intermittently sampled, and difficult to scale in clinical studies.

Our method was trained using the publicly available UNBC-McMaster Shoulder Pain Expression Archive Database, annotated using the Prkachin and Solomon Pain Intensity (PSPI) scale. We then evaluated the algorithm in a single-center, comparative, randomized, crossover clinical study examining the impact of different injection parameters on subcutaneous injection pain tolerance.
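For context, the PSPI score used for annotation is defined by Prkachin and Solomon as a fixed combination of facial action unit (AU) intensities. The following is an illustrative sketch of that published formula, not the project's actual code; AU intensities are coded 0-5, with AU43 (eye closure) coded 0 or 1, giving a 0-16 scale.

```python
def pspi(au4: int, au6: int, au7: int, au9: int, au10: int, au43: int) -> int:
    """Prkachin and Solomon Pain Intensity (PSPI) score.

    PSPI = AU4 + max(AU6, AU7) + max(AU9, AU10) + AU43
    AU4: brow lowerer, AU6: cheek raiser, AU7: lid tightener,
    AU9: nose wrinkler, AU10: upper lip raiser, AU43: eye closure (0/1).
    """
    return au4 + max(au6, au7) + max(au9, au10) + au43

# Example: moderate brow lowering and cheek raising, eyes open.
print(pspi(au4=2, au6=3, au7=1, au9=0, au10=1, au43=0))  # → 6
```

A frame-level PSPI of 0 indicates no observable pain expression; the maximum of 16 requires peak intensity on every contributing action unit.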

The poster presented results demonstrating the feasibility of continuous, automated pain quantification in clinical research settings, highlighting the potential for objective, scalable pain monitoring in both clinical trials and real-world treatment environments.