001     1033893
005     20241217215531.0
024 7 _ |a 10.48550/ARXIV.2409.17085
|2 doi
024 7 _ |a 10.34734/FZJ-2024-06731
|2 datacite_doi
037 _ _ |a FZJ-2024-06731
100 1 _ |a Paul, Richard D.
|0 P:(DE-Juel1)175101
|b 0
|e Corresponding author
|u fzj
245 _ _ |a Parameter-efficient Bayesian Neural Networks for Uncertainty-aware Depth Estimation
260 _ _ |c 2024
|b arXiv
336 7 _ |a Preprint
|b preprint
|m preprint
|0 PUB:(DE-HGF)25
|s 1734418575_31226
|2 PUB:(DE-HGF)
336 7 _ |a WORKING_PAPER
|2 ORCID
336 7 _ |a Electronic Article
|0 28
|2 EndNote
336 7 _ |a preprint
|2 DRIVER
336 7 _ |a ARTICLE
|2 BibTeX
336 7 _ |a Output Types/Working Paper
|2 DataCite
500 _ _ |a Presented as an Extended Abstract at the 3rd Workshop on Uncertainty Quantification for Computer Vision at ECCV'24.
520 _ _ |a State-of-the-art computer vision tasks, like monocular depth estimation (MDE), rely heavily on large, modern Transformer-based architectures. However, their application in safety-critical domains demands reliable predictive performance and uncertainty quantification. While Bayesian neural networks provide a conceptually simple approach to meet these requirements, they suffer from the high dimensionality of the parameter space. Parameter-efficient fine-tuning (PEFT) methods, in particular low-rank adaptations (LoRA), have emerged as a popular strategy for adapting large-scale models to downstream tasks by performing parameter inference on lower-dimensional subspaces. In this work, we investigate the suitability of PEFT methods for subspace Bayesian inference in large-scale Transformer-based vision models. We show that, indeed, combining BitFit, DiffFit, LoRA, and CoLoRA, a novel LoRA-inspired PEFT method, with Bayesian inference enables more robust and reliable predictive performance in MDE.
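As a rough illustration of the idea summarized in the abstract above, the following minimal PyTorch sketch (not taken from the paper; class names, rank, and scaling are illustrative assumptions) wraps a frozen pretrained linear layer with a LoRA adapter so that only the low-rank factors A and B remain trainable. Subspace Bayesian inference in the sense of the abstract would then place a posterior over these few adapter parameters only, for example via a Laplace approximation, rather than over the full weight space.

    # Minimal sketch, assuming PyTorch; not the authors' implementation.
    import torch
    import torch.nn as nn

    class LoRALinear(nn.Module):
        """Frozen base linear layer plus a trainable low-rank update (alpha/r) * B @ A."""

        def __init__(self, base: nn.Linear, rank: int = 4, alpha: float = 8.0):
            super().__init__()
            self.base = base
            for p in self.base.parameters():   # freeze the pretrained weights
                p.requires_grad = False
            self.lora_A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
            self.lora_B = nn.Parameter(torch.zeros(base.out_features, rank))
            self.scale = alpha / rank

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # The low-rank correction acts only in a small subspace of the full weight space.
            return self.base(x) + self.scale * (x @ self.lora_A.T @ self.lora_B.T)

    if __name__ == "__main__":
        layer = LoRALinear(nn.Linear(768, 768), rank=4)
        n_trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
        n_total = sum(p.numel() for p in layer.parameters())
        print(f"trainable {n_trainable} of {n_total} parameters")  # only A and B
        print(layer(torch.randn(2, 768)).shape)                    # torch.Size([2, 768])

The only point of the sketch is the parameter count: for a 768x768 layer the adapter exposes about 6k trainable parameters instead of roughly 590k, which is what makes a Bayesian treatment of the subspace tractable.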
536 _ _ |a 5112 - Cross-Domain Algorithms, Tools, Methods Labs (ATMLs) and Research Groups (POF4-511)
|0 G:(DE-HGF)POF4-5112
|c POF4-511
|f POF IV
|x 0
588 _ _ |a Dataset connected to DataCite
650 _ 7 |a Computer Vision and Pattern Recognition (cs.CV)
|2 Other
650 _ 7 |a Machine Learning (stat.ML)
|2 Other
650 _ 7 |a FOS: Computer and information sciences
|2 Other
700 1 _ |a Quercia, Alessio
|0 P:(DE-Juel1)188471
|b 1
|u fzj
700 1 _ |a Fortuin, Vincent
|0 P:(DE-HGF)0
|b 2
700 1 _ |a Nöh, Katharina
|0 P:(DE-Juel1)129051
|b 3
|u fzj
700 1 _ |a Scharr, Hanno
|0 P:(DE-Juel1)129394
|b 4
|u fzj
773 _ _ |a 10.48550/ARXIV.2409.17085
856 4 _ |u https://arxiv.org/abs/2409.17085
856 4 _ |u https://juser.fz-juelich.de/record/1033893/files/2409.17085v1.pdf
|y OpenAccess
909 C O |o oai:juser.fz-juelich.de:1033893
|p openaire
|p open_access
|p VDB
|p driver
|p dnbdelivery
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 0
|6 P:(DE-Juel1)175101
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 1
|6 P:(DE-Juel1)188471
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 3
|6 P:(DE-Juel1)129051
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 4
|6 P:(DE-Juel1)129394
913 1 _ |a DE-HGF
|b Key Technologies
|l Engineering Digital Futures – Supercomputing, Data Management and Information Security for Knowledge and Action
|1 G:(DE-HGF)POF4-510
|0 G:(DE-HGF)POF4-511
|3 G:(DE-HGF)POF4
|2 G:(DE-HGF)POF4-500
|4 G:(DE-HGF)POF
|v Enabling Computational- & Data-Intensive Science and Engineering
|9 G:(DE-HGF)POF4-5112
|x 0
914 1 _ |y 2024
915 _ _ |a OpenAccess
|0 StatID:(DE-HGF)0510
|2 StatID
920 _ _ |l yes
920 1 _ |0 I:(DE-Juel1)IAS-8-20210421
|k IAS-8
|l Datenanalyse und Maschinenlernen
|x 0
920 1 _ |0 I:(DE-Juel1)IBG-1-20101118
|k IBG-1
|l Biotechnologie
|x 1
980 _ _ |a preprint
980 _ _ |a VDB
980 _ _ |a UNRESTRICTED
980 _ _ |a I:(DE-Juel1)IAS-8-20210421
980 _ _ |a I:(DE-Juel1)IBG-1-20101118
980 1 _ |a FullTexts