
Development of a binary classifier model from extended facial codes toward video-based pain recognition in cancer patients

Cascella, Marco
;
2023-01-01

Abstract

Objectives: Automatic Pain Assessment (APA) relies on objective methods to evaluate the severity of pain and other pain-related characteristics. Facial expressions are the most investigated pain-behavior features for APA. We constructed a binary classifier model for discriminating between the absence and presence of pain through video analysis.

Methods: A brief interview lasting approximately two minutes was conducted with cancer patients, and video recordings were taken during the session. The Delaware Pain Database and the UNBC-McMaster Shoulder Pain dataset were used for training. A set of 17 Action Units (AUs) was adopted. For each image, the OpenFace toolkit was used to extract the considered AUs. The collected data were grouped and split into training and test sets: 80% of the data was used for training and the remaining 20% for validation. For continuous estimation, the entire patient video, with per-frame prediction values of 0 (no pain) or 1 (pain), was imported into an annotator (ELAN 6.4). The developed neural network classifier consists of two dense layers. The first layer contains 17 nodes associated with the facial AUs extracted by OpenFace for each image. The output layer produces a classification label of "pain" (1) or "no pain" (0).

Results: The classifier obtained an accuracy of approximately 94% after about 400 training epochs. The area under the ROC curve (AUROC) was approximately 0.98.

Conclusions: This study demonstrated that a binary classifier model developed from selected AUs can be an effective tool for evaluating cancer pain. The implementation of an APA classifier can be useful for detecting potential pain fluctuations. In the context of APA research, further investigation is necessary to refine the process and, in particular, to combine these data with multi-parameter analyses such as speech analysis, text analysis, and data obtained from physiological parameters.
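The frame-level pipeline described in the Methods (17 OpenFace AU intensities in, a two-dense-layer network out, with a binary pain/no-pain label per frame) can be sketched as below. This is a minimal illustration, not the authors' implementation: the trained weights are not published, so random placeholder weights are used, and the ReLU/sigmoid activations and 0.5 decision threshold are assumptions not stated in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder weights: the paper does not publish its trained parameters.
W1 = rng.normal(scale=0.1, size=(17, 17))  # dense layer 1: 17 AU inputs -> 17 nodes
b1 = np.zeros(17)
W2 = rng.normal(scale=0.1, size=(17, 1))   # output layer: 17 nodes -> 1 logit
b2 = np.zeros(1)

def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def predict(au_intensities):
    """Map one frame's 17 AU intensities to a 0 (no pain) / 1 (pain) label."""
    h = relu(au_intensities @ W1 + b1)   # first dense layer, one node per AU
    p = sigmoid(h @ W2 + b2)             # probability of "pain"
    return int(p.item() >= 0.5)          # assumed 0.5 decision threshold

# One frame of OpenFace AU intensities (OpenFace reports intensities on a 0-5 scale).
frame_aus = rng.uniform(0.0, 5.0, size=(1, 17))
label = predict(frame_aus)               # 0 or 1, one value per video frame
```

Running `predict` over every frame of a video yields the 0/1 sequence that the authors imported into ELAN 6.4 for continuous annotation.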
Files for this item:
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11386/4855931

Warning: the displayed data have not been validated by the university.

Citazioni
  • PMC: 0
  • Scopus: 0
  • Web of Science (ISI): 0