000905638 001__ 905638
000905638 005__ 20220131120338.0
000905638 0247_ $$2arXiv$$aarXiv:2106.00116
000905638 0247_ $$2Handle$$a2128/30453
000905638 0247_ $$2altmetric$$aaltmetric:106857186
000905638 037__ $$aFZJ-2022-00865
000905638 1001_ $$0P:(DE-Juel1)180894$$aCherti, Mehdi$$b0$$eCorresponding author$$ufzj
000905638 1112_ $$aMedical Imaging Meets NeurIPS (MedNeurIPS)$$cSydney / online$$d2021-12-06 - 2021-12-14$$wAustralia
000905638 245__ $$aEffect of Pre-Training Scale on Intra- and Inter-Domain Full and Few-Shot Transfer Learning for Natural and Medical X-Ray Chest Images
000905638 260__ $$c2021
000905638 300__ $$a1-6
000905638 3367_ $$2ORCID$$aCONFERENCE_PAPER
000905638 3367_ $$033$$2EndNote$$aConference Paper
000905638 3367_ $$2BibTeX$$aINPROCEEDINGS
000905638 3367_ $$2DRIVER$$aconferenceObject
000905638 3367_ $$2DataCite$$aOutput Types/Conference Paper
000905638 3367_ $$0PUB:(DE-HGF)8$$2PUB:(DE-HGF)$$aContribution to a conference proceedings$$bcontrib$$mcontrib$$s1642783433_21910
000905638 520__ $$aTransfer learning aims to exploit pre-trained models for more efficient follow-up training on a wide range of downstream tasks and datasets, enabling successful training on small data as well. Recently, strong improvements were shown for transfer learning and model generalization when increasing model, data and compute budget scale in the pre-training. To compare the effect of scale in both intra- and inter-domain full and few-shot transfer, in this study we combine for the first time large openly available medical X-Ray chest imaging datasets to reach a dataset scale comparable to ImageNet-1k. We then conduct pre-training and transfer to different natural or medical targets while varying network size and source data scale and domain, the source being either large natural (ImageNet-1k/21k) or large medical chest X-Ray datasets. We observe a strong improvement due to larger pre-training scale for intra-domain natural-natural and medical-medical transfer. For inter-domain natural-medical transfer, we find improvements due to larger pre-training scale on larger X-Ray targets in the full-shot regime, while for smaller targets and in the few-shot regime the improvement is not visible. Remarkably, large networks pre-trained on the very large natural ImageNet-21k are as good as or better than networks pre-trained on the largest available medical X-Ray data when performing transfer to large X-Ray targets. We conclude that high-quality models for inter-domain transfer can also be obtained by substantially increasing the scale of the model and generic natural source data, removing the necessity for large domain-specific medical source data in the pre-training. Code is available at: \url{https://github.com/SLAMPAI/large-scale-pretraining-transfer}
000905638 536__ $$0G:(DE-HGF)POF4-5112$$a5112 - Cross-Domain Algorithms, Tools, Methods Labs (ATMLs) and Research Groups (POF4-511)$$cPOF4-511$$fPOF IV$$x0
000905638 588__ $$aDataset connected to arXiv
000905638 7001_ $$0P:(DE-Juel1)158080$$aJitsev, Jenia$$b1$$eCorresponding author$$ufzj
000905638 8564_ $$uhttp://www.cse.cuhk.edu.hk/~qdou/public/medneurips2021/21_effect_scale_transfer_final_camera_MedNeurIPS2021.pdf
000905638 8564_ $$uhttps://juser.fz-juelich.de/record/905638/files/21_effect_scale_transfer_final_camera_MedNeurIPS2021%281%29.pdf$$yOpenAccess
000905638 909CO $$ooai:juser.fz-juelich.de:905638$$pdnbdelivery$$pdriver$$pVDB$$popen_access$$popenaire
000905638 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)180894$$aForschungszentrum Jülich$$b0$$kFZJ
000905638 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)158080$$aForschungszentrum Jülich$$b1$$kFZJ
000905638 9131_ $$0G:(DE-HGF)POF4-511$$1G:(DE-HGF)POF4-510$$2G:(DE-HGF)POF4-500$$3G:(DE-HGF)POF4$$4G:(DE-HGF)POF$$9G:(DE-HGF)POF4-5112$$aDE-HGF$$bKey Technologies$$lEngineering Digital Futures – Supercomputing, Data Management and Information Security for Knowledge and Action$$vEnabling Computational- & Data-Intensive Science and Engineering$$x0
000905638 9141_ $$y2021
000905638 915__ $$0StatID:(DE-HGF)0510$$2StatID$$aOpenAccess
000905638 920__ $$lyes
000905638 9201_ $$0I:(DE-Juel1)JSC-20090406$$kJSC$$lJülich Supercomputing Centre$$x0
000905638 980__ $$acontrib
000905638 980__ $$aVDB
000905638 980__ $$aUNRESTRICTED
000905638 980__ $$aI:(DE-Juel1)JSC-20090406
000905638 9801_ $$aFullTexts