000281832 001__ 281832
000281832 005__ 20251113125815.0
000281832 0247_ $$2doi$$a10.1162/IMAG.a.960
000281832 0247_ $$2pmid$$apmid:41158555
000281832 0247_ $$2pmc$$apmc:PMC12556684
000281832 037__ $$aDZNE-2025-01213
000281832 041__ $$aEnglish
000281832 082__ $$a610
000281832 1001_ $$00009-0006-9724-0626$$aFortin, Marc-Antoine$$b0
000281832 245__ $$aGOUHFI: A novel contrast- and resolution-agnostic segmentation tool for ultra-high-field MRI.
000281832 260__ $$aCambridge, MA$$bMIT Press$$c2025
000281832 3367_ $$2DRIVER$$aarticle
000281832 3367_ $$2DataCite$$aOutput Types/Journal article
000281832 3367_ $$0PUB:(DE-HGF)16$$2PUB:(DE-HGF)$$aJournal Article$$bjournal$$mjournal$$s1763035017_32675
000281832 3367_ $$2BibTeX$$aARTICLE
000281832 3367_ $$2ORCID$$aJOURNAL_ARTICLE
000281832 3367_ $$00$$2EndNote$$aJournal Article
000281832 520__ $$aRecently, ultra-high-field MRI (UHF-MRI) has become more available and one of the best tools to study the brain for neuroscientists. One common step in quantitative neuroimaging is to segment the brain into several regions, which has been done using software packages such as FreeSurfer, FastSurferVINN, or SynthSeg. However, the differences between UHF-MRI and 1.5T or 3T images are such that the automatic segmentation techniques optimized at these field strengths usually produce unsatisfactory segmentation results for UHF images. Thus, it has been particularly challenging to perform region-based quantitative analyses as typically done with 1.5-3T data, considerably limiting the potential of UHF-MRI until now. Ultimately, this underscores the crucial need for developing new automatic segmentation techniques designed to handle UHF images. Hence, we propose a novel Deep Learning (DL)-based segmentation technique called GOUHFI: Generalized and Optimized segmentation tool for ultra-high-field images, designed to segment UHF images of various contrasts and resolutions. For training, we used a total of 206 label maps from four datasets acquired at 3T, 7T, and 9.4T. In contrast to most DL strategies, we used a previously proposed domain randomization approach, where synthetic images generated from the 206 label maps were used for training a 3D U-Net. This approach enables the DL model to become contrast agnostic. GOUHFI was tested on seven different datasets and compared with existing techniques such as FastSurferVINN, SynthSeg, and CEREBRUM-7T. GOUHFI was able to segment the six contrasts and seven resolutions tested at 3T, 7T, and 9.4T. Average Dice-Sørensen Similarity Coefficient (DSC) scores of 0.90, 0.90, and 0.93 were computed against the ground truth segmentations at 3T, 7T, and 9.4T, respectively. These results demonstrated GOUHFI's superior performance to competing approaches at each resolution and contrast level tested. Moreover, GOUHFI demonstrated impressive resistance to the typical inhomogeneities observed at UHF-MRI, making it a new powerful segmentation tool allowing the usual quantitative analysis pipelines performed at lower fields to be applied also at UHF. Ultimately, GOUHFI is a promising new segmentation tool, being the first of its kind to offer a contrast- and resolution-agnostic alternative for UHF-MRI without requiring fine-tuning or retraining, making it the forthcoming alternative for neuroscientists working with UHF-MRI or even lower field strengths.
000281832 536__ $$0G:(DE-HGF)POF4-354$$a354 - Disease Prevention and Healthy Aging (POF4-354)$$cPOF4-354$$fPOF IV$$x0
000281832 588__ $$aDataset connected to CrossRef, PubMed, Journals: pub.dzne.de
000281832 650_7 $$2Other$$aUHF-MRI
000281832 650_7 $$2Other$$abrain segmentation
000281832 650_7 $$2Other$$acontrast and resolution agnosticity
000281832 650_7 $$2Other$$adeep learning
000281832 650_7 $$2Other$$adomain randomization
000281832 650_7 $$2Other$$aneuroimaging
000281832 7001_ $$aKristoffersen, Anne Louise$$b1
000281832 7001_ $$aLarsen, Michael Staff$$b2
000281832 7001_ $$aLamalle, Laurent$$b3
000281832 7001_ $$0P:(DE-2719)2810697$$aStirnberg, Rüdiger$$b4$$udzne
000281832 7001_ $$0P:(DE-2719)9002873$$aGoa, Pal Erik$$b5$$udzne
000281832 773__ $$0PERI:(DE-600)3167925-0$$a10.1162/IMAG.a.960$$gVol. 3, p. IMAG.a.960$$pIMAG.a.960$$tImaging neuroscience$$v3$$x2837-6056$$y2025
000281832 8564_ $$uhttps://pub.dzne.de/record/281832/files/DZNE-2025-01213.pdf$$yOpenAccess
000281832 8564_ $$uhttps://pub.dzne.de/record/281832/files/DZNE-2025-01213.pdf?subformat=pdfa$$xpdfa$$yOpenAccess
000281832 9101_ $$0I:(DE-588)1065079516$$6P:(DE-2719)2810697$$aDeutsches Zentrum für Neurodegenerative Erkrankungen$$b4$$kDZNE
000281832 9101_ $$0I:(DE-HGF)0$$6P:(DE-2719)9002873$$aExternal Institute$$b5$$kExtern
000281832 9131_ $$0G:(DE-HGF)POF4-354$$1G:(DE-HGF)POF4-350$$2G:(DE-HGF)POF4-300$$3G:(DE-HGF)POF4$$4G:(DE-HGF)POF$$aDE-HGF$$bGesundheit$$lNeurodegenerative Diseases$$vDisease Prevention and Healthy Aging$$x0
000281832 9141_ $$y2025
000281832 915__ $$0LIC:(DE-HGF)CCBY4$$2HGFVOC$$aCreative Commons Attribution CC BY 4.0
000281832 915__ $$0StatID:(DE-HGF)0501$$2StatID$$aDBCoverage$$bDOAJ Seal$$d2024-09-26T09:40:26Z
000281832 915__ $$0StatID:(DE-HGF)0500$$2StatID$$aDBCoverage$$bDOAJ$$d2024-09-26T09:40:26Z
000281832 915__ $$0StatID:(DE-HGF)0510$$2StatID$$aOpenAccess
000281832 915__ $$0StatID:(DE-HGF)0030$$2StatID$$aPeer Review$$bDOAJ : Anonymous peer review$$d2024-09-26T09:40:26Z
000281832 915__ $$0StatID:(DE-HGF)0561$$2StatID$$aArticle Processing Charges$$d2025-01-02
000281832 915__ $$0StatID:(DE-HGF)0300$$2StatID$$aDBCoverage$$bMedline$$d2025-01-02
000281832 915__ $$0StatID:(DE-HGF)0700$$2StatID$$aFees$$d2025-01-02
000281832 9201_ $$0I:(DE-2719)1013026$$kAG Stöcker$$lMR Physics$$x0
000281832 980__ $$ajournal
000281832 980__ $$aVDB
000281832 980__ $$aUNRESTRICTED
000281832 980__ $$aI:(DE-2719)1013026
000281832 9801_ $$aFullTexts