TY - GEN
T1 - The AI4Pain Grand Challenge 2024
T2 - 12th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos, ACIIW 2024
AU - Fernandez-Rojas, Raul
AU - Joseph, Calvin
AU - Hirachan, Niraj
AU - Seymour, Ben
AU - Goecke, Roland
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
AB - The Multimodal Sensing Grand Challenge for NextGen Pain Assessment (AI4PAIN) is the first international competition focused on automating the recognition of acute pain using multimodal sensing technologies. Participants are tasked with classifying pain intensity into three categories: No Pain, Low Pain, and High Pain, utilising functional near-infrared spectroscopy (fNIRS) and facial video recordings. This paper presents the baseline results of our approach, examining both individual and combined modalities. Notably, this challenge represents a pioneering effort to advance pain recognition by integrating neurological information (fNIRS) with behavioural data (facial video). The AI4Pain Grand Challenge aims to generate a novel multimodal sensing dataset, facilitating benchmarking and serving as a valuable resource for future research in autonomous pain assessment. The results show that individual fNIRS data achieved the highest accuracy, with 43.2% for the validation set and 43.3% for the test set, while facial data yielded the lowest accuracy, with 40.0% for the validation set and 40.1% for the test set. The combined multimodal approach produced accuracies of 40.2% for the validation set and 41.7% for the test set. This challenge provides the research community with a significant opportunity to enhance the understanding of pain, ultimately aiming to improve the quality of life for many pain sufferers through advanced, automated pain assessment methods.
KW - AI
KW - Face
KW - fNIRS
KW - Multimodal
KW - Pain Assessment
UR - http://www.scopus.com/inward/record.url?scp=105004796340&partnerID=8YFLogxK
U2 - 10.1109/ACIIW63320.2024.00012
DO - 10.1109/ACIIW63320.2024.00012
M3 - Conference contribution
AN - SCOPUS:105004796340
T3 - Proceedings - 2024 12th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos, ACIIW 2024
SP - 55
EP - 60
BT - Proceedings - 2024 12th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos, ACIIW 2024
A2 - Mahmoud, Marwa
A2 - Celiktutan, Oya
A2 - Cañamero, Lola
PB - IEEE, Institute of Electrical and Electronics Engineers
Y2 - 15 September 2024 through 18 September 2024
ER -