%0 Journal Article
%A Aldenhoven, Céline Madeleine
%A Nissen, Leon
%A Heinemann, Marie
%A Dogdu, Cem
%A Hanke, Alexander
%A Jonas, Stephan
%A Reimer, Lara Marie
%T Real-Time Emotion Recognition Performance of Mobile Devices: A Detailed Analysis of Camera and TrueDepth Sensors Using Apple's ARKit.
%J Sensors
%V 26
%N 3
%@ 1424-8220
%C Basel
%I MDPI
%M DZNE-2026-00199
%P 1060
%D 2026
%X Facial features hold information about a person's emotions, motor function, or genetic defects. Since most current mobile devices are capable of real-time face detection using cameras and depth sensors, real-time facial analysis can be utilized in several mobile use cases. Understanding the real-time emotion recognition capabilities of device sensors and frameworks is vital for developing new, valid applications. Therefore, we evaluated on-device emotion recognition using Apple's ARKit on an iPhone 14 Pro. A native app elicited 36 blend shape-specific movements and 7 discrete emotions from N=31 healthy adults. Per frame, standardized ARKit blend shapes were classified using a prototype-based cosine similarity metric; performance was summarized as accuracy and area under the receiver operating characteristic curves. Cosine similarity achieved an overall accuracy of 68.3%
%K Humans
%K Emotions: physiology
%K Adult
%K Male
%K Female
%K Mobile Applications
%K Smartphone
%K Young Adult
%K Facial Expression
%K ARKit (Other)
%K emotion recognition (Other)
%K face tracking (Other)
%K real-time (Other)
%K sensors (Other)
%F PUB:(DE-HGF)16
%9 Journal Article
%$ pmid:41682575
%2 pmc:PMC12899966
%R 10.3390/s26031060
%U https://pub.dzne.de/record/285257