TY - JOUR
AU - Aldenhoven, Céline Madeleine
AU - Nissen, Leon
AU - Heinemann, Marie
AU - Dogdu, Cem
AU - Hanke, Alexander
AU - Jonas, Stephan
AU - Reimer, Lara Marie
TI - Real-Time Emotion Recognition Performance of Mobile Devices: A Detailed Analysis of Camera and TrueDepth Sensors Using Apple's ARKit.
JO - Sensors
VL - 26
IS - 3
SN - 1424-8220
CY - Basel
PB - MDPI
M1 - DZNE-2026-00199
SP - 1060
PY - 2026
AB - Facial features hold information about a person's emotions, motor function, or genetic defects. Since most current mobile devices are capable of real-time face detection using cameras and depth sensors, real-time facial analysis can be utilized in several mobile use cases. Understanding the real-time emotion recognition capabilities of device sensors and frameworks is vital for developing new, valid applications. Therefore, we evaluated on-device emotion recognition using Apple's ARKit on an iPhone 14 Pro. A native app elicited 36 blend-shape-specific movements and 7 discrete emotions from N=31 healthy adults. Per frame, standardized ARKit blend shapes were classified using a prototype-based cosine similarity metric; performance was summarized as accuracy and area under the receiver operating characteristic curve. Cosine similarity achieved an overall accuracy of 68.3%.
KW - Humans
KW - Emotions/physiology
KW - Adult
KW - Male
KW - Female
KW - Mobile Applications
KW - Smartphone
KW - Young Adult
KW - Facial Expression
KW - ARKit
KW - emotion recognition
KW - face tracking
KW - real-time
KW - sensors
LB - PUB:(DE-HGF)16
C6 - pmid:41682575
C2 - pmc:PMC12899966
DO - 10.3390/s26031060
UR - https://pub.dzne.de/record/285257
ER -