Parameter-efficient Bayesian Neural Networks for Uncertainty-aware Depth Estimation
Preprint | FZJ-2024-06731
2024
arXiv
Please use a persistent identifier in citations: doi:10.48550/ARXIV.2409.17085 or doi:10.34734/FZJ-2024-06731
Abstract: State-of-the-art computer vision tasks, like monocular depth estimation (MDE), rely heavily on large, modern Transformer-based architectures. However, their application in safety-critical domains demands reliable predictive performance and uncertainty quantification. While Bayesian neural networks provide a conceptually simple approach to meeting these requirements, they suffer from the high dimensionality of the parameter space. Parameter-efficient fine-tuning (PEFT) methods, in particular low-rank adaptations (LoRA), have emerged as a popular strategy for adapting large-scale models to downstream tasks by performing parameter inference on lower-dimensional subspaces. In this work, we investigate the suitability of PEFT methods for subspace Bayesian inference in large-scale Transformer-based vision models. We show that, indeed, combining BitFit, DiffFit, LoRA, and CoLoRA, a novel LoRA-inspired PEFT method, with Bayesian inference enables more robust and reliable predictive performance in MDE.
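To make the subspace idea concrete, below is a minimal, hypothetical PyTorch sketch (not taken from the paper, and not reproducing BitFit, DiffFit, or CoLoRA): Bayesian inference is restricted to a LoRA-style adapter, so the frozen pretrained weight stays a point estimate while only the low-rank factors carry a mean-field Gaussian variational posterior. All names and hyperparameters (`BayesianLoRALinear`, `rank`, `prior_std`) are illustrative assumptions.

```python
# Hypothetical sketch: variational Bayesian inference over a LoRA subspace.
# The pretrained base layer is frozen; only the low-rank factors A and B
# receive a factorised Gaussian posterior.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F


class BayesianLoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, rank: int = 4, prior_std: float = 1.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():      # keep pretrained weights fixed
            p.requires_grad_(False)
        in_f, out_f = base.in_features, base.out_features
        # Variational parameters of the low-rank update delta_W = B @ A
        self.A_mu = nn.Parameter(torch.randn(rank, in_f) * 0.01)
        self.A_logstd = nn.Parameter(torch.full((rank, in_f), -5.0))
        self.B_mu = nn.Parameter(torch.zeros(out_f, rank))
        self.B_logstd = nn.Parameter(torch.full((out_f, rank), -5.0))
        self.prior_std = prior_std

    def forward(self, x):
        # One reparameterised Monte Carlo sample of the adapter factors per call
        A = self.A_mu + self.A_logstd.exp() * torch.randn_like(self.A_mu)
        B = self.B_mu + self.B_logstd.exp() * torch.randn_like(self.B_mu)
        return self.base(x) + F.linear(F.linear(x, A), B)

    def kl(self):
        # KL(q || p) between the factorised Gaussian posterior and a
        # zero-mean isotropic Gaussian prior with std prior_std
        def _kl(mu, logstd):
            var = (2 * logstd).exp()
            return 0.5 * ((var + mu**2) / self.prior_std**2 - 1.0
                          - 2 * logstd + 2 * math.log(self.prior_std)).sum()
        return _kl(self.A_mu, self.A_logstd) + _kl(self.B_mu, self.B_logstd)


# Usage: wrap one projection layer, then average several stochastic passes
layer = BayesianLoRALinear(nn.Linear(768, 768), rank=4)
x = torch.randn(2, 768)
preds = torch.stack([layer(x) for _ in range(8)])
mean, std = preds.mean(0), preds.std(0)   # predictive mean and uncertainty
```

In a full model this wrapper would be applied to selected attention or MLP projections of the Transformer backbone, with the ELBO (likelihood minus the summed `kl()` terms) as the training objective; the actual PEFT variants and inference schemes studied in the paper may differ.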
Keyword(s): Computer Vision and Pattern Recognition (cs.CV); Machine Learning (stat.ML); FOS: Computer and information sciences