000275937 001__ 275937
000275937 005__ 20250121165737.0
000275937 0247_ $$2doi$$a10.1038/s43856-024-00701-w
000275937 0247_ $$2pmid$$apmid:39809877
000275937 0247_ $$2pmc$$apmc:PMC11733215
000275937 037__ $$aDZNE-2025-00159
000275937 041__ $$aEnglish
000275937 1001_ $$00000-0003-3390-6418$$aKulvicius, Tomas$$b0
000275937 245__ $$aDeep learning empowered sensor fusion boosts infant movement classification.
000275937 260__ $$a[London]$$bSpringer Nature$$c2025
000275937 3367_ $$2DRIVER$$aarticle
000275937 3367_ $$2DataCite$$aOutput Types/Journal article
000275937 3367_ $$0PUB:(DE-HGF)16$$2PUB:(DE-HGF)$$aJournal Article$$bjournal$$mjournal$$s1737451650_5211
000275937 3367_ $$2BibTeX$$aARTICLE
000275937 3367_ $$2ORCID$$aJOURNAL_ARTICLE
000275937 3367_ $$00$$2EndNote$$aJournal Article
000275937 520__ $$aTo assess the integrity of the developing nervous system, the Prechtl general movement assessment (GMA) is recognized for its clinical value in diagnosing neurological impairments in early infancy. GMA has been increasingly augmented through machine learning approaches aiming to scale up its application, reduce the costs of training human assessors, and further standardize the classification of spontaneous motor patterns. Available deep learning tools, all of which are based on single sensor modalities, nevertheless remain considerably inferior to well-trained human assessors. These approaches are also hardly comparable, as all models are designed, trained, and evaluated on proprietary/siloed data sets. With this study we propose a sensor fusion approach for assessing fidgety movements (FMs). FMs were recorded from 51 typically developing participants. We compared three different sensor modalities (pressure, inertial, and visual sensors). Various combinations and two sensor fusion approaches (late and early fusion) for infant movement classification were tested to evaluate whether a multi-sensor system outperforms single-modality assessments. Convolutional neural network (CNN) architectures were used to classify movement patterns. The performance of the three-sensor fusion (classification accuracy of 94.5%) is significantly higher than that of any single modality evaluated. We show that the sensor fusion approach is a promising avenue for automated classification of infant motor patterns. The development of a robust sensor fusion system may significantly enhance AI-based early recognition of neurofunctions, ultimately facilitating automated early detection of neurodevelopmental conditions.
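The abstract contrasts early and late fusion of per-modality CNNs without specifying the architectures used. The following minimal sketch, assuming PyTorch and illustrative (not reported) channel counts, layer sizes, and class count, shows the difference between the two fusion schemes; it is not the authors' implementation.

# Illustrative sketch only: all shapes, layer sizes, and channel counts are assumptions.
import torch
import torch.nn as nn

class ModalityCNN(nn.Module):
    """Small 1D CNN encoder for one sensor stream (e.g., pressure, inertial, or visual features)."""
    def __init__(self, in_channels, feat_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(32, feat_dim, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),        # collapse the time axis
        )
    def forward(self, x):                   # x: (batch, channels, time)
        return self.net(x).squeeze(-1)      # (batch, feat_dim)

class EarlyFusion(nn.Module):
    """Early fusion: concatenate per-modality features, then a single classifier head."""
    def __init__(self, channel_list, n_classes=2):
        super().__init__()
        self.encoders = nn.ModuleList([ModalityCNN(c) for c in channel_list])
        self.head = nn.Linear(64 * len(channel_list), n_classes)
    def forward(self, streams):
        feats = [enc(x) for enc, x in zip(self.encoders, streams)]
        return self.head(torch.cat(feats, dim=1))

class LateFusion(nn.Module):
    """Late fusion: one classifier per modality, predictions combined (here: averaged)."""
    def __init__(self, channel_list, n_classes=2):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Sequential(ModalityCNN(c), nn.Linear(64, n_classes)) for c in channel_list
        ])
    def forward(self, streams):
        logits = [branch(x) for branch, x in zip(self.branches, streams)]
        return torch.stack(logits, dim=0).mean(dim=0)

# Toy usage with assumed channel counts: pressure mat (1), inertial sensors (6), visual pose features (34).
pressure = torch.randn(8, 1, 250)
inertial = torch.randn(8, 6, 250)
visual   = torch.randn(8, 34, 250)
print(EarlyFusion([1, 6, 34])([pressure, inertial, visual]).shape)  # torch.Size([8, 2])
print(LateFusion([1, 6, 34])([pressure, inertial, visual]).shape)   # torch.Size([8, 2])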
000275937 536__ $$0G:(DE-HGF)POF4-352$$a352 - Disease Mechanisms (POF4-352)$$cPOF4-352$$fPOF IV$$x0
000275937 588__ $$aDataset connected to CrossRef, PubMed, Journals: pub.dzne.de
000275937 7001_ $$00000-0001-6050-7426$$aZhang, Dajie$$b1
000275937 7001_ $$aPoustka, Luise$$b2
000275937 7001_ $$00000-0002-4579-4970$$aBölte, Sven$$b3
000275937 7001_ $$00000-0001-6954-600X$$aJahn, Lennart$$b4
000275937 7001_ $$aFlügge, Sarah$$b5
000275937 7001_ $$aKraft, Marc$$b6
000275937 7001_ $$0P:(DE-2719)2810591$$aZweckstetter, Markus$$b7
000275937 7001_ $$00000-0002-1742-5211$$aNielsen-Saines, Karin$$b8
000275937 7001_ $$00000-0001-8206-9738$$aWörgötter, Florentin$$b9
000275937 7001_ $$00000-0001-8932-0980$$aMarschik, Peter B$$b10
000275937 773__ $$0PERI:(DE-600)3096949-9$$a10.1038/s43856-024-00701-w$$gVol. 5, no. 1, p. 16$$n1$$p16$$tCommunications medicine$$v5$$x2730-664X$$y2025
000275937 8564_ $$uhttps://pub.dzne.de/record/275937/files/DZNE-2025-00159%20SUP.zip
000275937 8564_ $$uhttps://pub.dzne.de/record/275937/files/DZNE-2025-00159.pdf$$yOpenAccess
000275937 8564_ $$uhttps://pub.dzne.de/record/275937/files/DZNE-2025-00159.pdf?subformat=pdfa$$xpdfa$$yOpenAccess
000275937 909CO $$ooai:pub.dzne.de:275937$$popenaire$$popen_access$$pVDB$$pdriver$$pdnbdelivery
000275937 9101_ $$0I:(DE-588)1065079516$$6P:(DE-2719)2810591$$aDeutsches Zentrum für Neurodegenerative Erkrankungen$$b7$$kDZNE
000275937 9131_ $$0G:(DE-HGF)POF4-352$$1G:(DE-HGF)POF4-350$$2G:(DE-HGF)POF4-300$$3G:(DE-HGF)POF4$$4G:(DE-HGF)POF$$aDE-HGF$$bGesundheit$$lNeurodegenerative Diseases$$vDisease Mechanisms$$x0
000275937 9141_ $$y2025
000275937 915__ $$0StatID:(DE-HGF)0200$$2StatID$$aDBCoverage$$bSCOPUS$$d2024-12-28
000275937 915__ $$0StatID:(DE-HGF)1050$$2StatID$$aDBCoverage$$bBIOSIS Previews$$d2024-12-28
000275937 915__ $$0StatID:(DE-HGF)1190$$2StatID$$aDBCoverage$$bBiological Abstracts$$d2024-12-28
000275937 915__ $$0LIC:(DE-HGF)CCBY4$$2HGFVOC$$aCreative Commons Attribution CC BY 4.0
000275937 915__ $$0StatID:(DE-HGF)0112$$2StatID$$aWoS$$bEmerging Sources Citation Index$$d2024-12-28
000275937 915__ $$0StatID:(DE-HGF)0501$$2StatID$$aDBCoverage$$bDOAJ Seal$$d2024-04-10T15:36:49Z
000275937 915__ $$0StatID:(DE-HGF)0500$$2StatID$$aDBCoverage$$bDOAJ$$d2024-04-10T15:36:49Z
000275937 915__ $$0StatID:(DE-HGF)0700$$2StatID$$aFees$$d2024-12-28
000275937 915__ $$0StatID:(DE-HGF)0150$$2StatID$$aDBCoverage$$bWeb of Science Core Collection$$d2024-12-28
000275937 915__ $$0StatID:(DE-HGF)0510$$2StatID$$aOpenAccess
000275937 915__ $$0StatID:(DE-HGF)0030$$2StatID$$aPeer Review$$bDOAJ : Open peer review, Anonymous peer review, Double anonymous peer review$$d2024-04-10T15:36:49Z
000275937 915__ $$0StatID:(DE-HGF)0561$$2StatID$$aArticle Processing Charges$$d2024-12-28
000275937 915__ $$0StatID:(DE-HGF)0300$$2StatID$$aDBCoverage$$bMedline$$d2024-12-28
000275937 915__ $$0StatID:(DE-HGF)0199$$2StatID$$aDBCoverage$$bClarivate Analytics Master Journal List$$d2024-12-28
000275937 9201_ $$0I:(DE-2719)1410001$$kAG Zweckstetter$$lTranslational Structural Biology$$x0
000275937 980__ $$ajournal
000275937 980__ $$aVDB
000275937 980__ $$aUNRESTRICTED
000275937 980__ $$aI:(DE-2719)1410001
000275937 9801_ $$aFullTexts