Conference Presentation (After Call) FZJ-2025-03853

Summation Compression for Very-Low Rank Adaptation


2025

Helmholtz AI Conference, RWTH Aachen, Germany, 3 Jun 2025 - 5 Jun 2025

Abstract: Parameter-Efficient Fine-Tuning (PEFT) methods have transformed the approach to fine-tuning large models for downstream tasks by enabling the adjustment of significantly fewer parameters than those in the original model matrices. In this work, we study the "very low rank regime", where we fine-tune the fewest parameters per linear layer for each considered PEFT method. We propose 1LoRA (Summation Low-Rank Adaptation), a compute-, parameter- and memory-efficient fine-tuning method which uses the feature sum as a fixed compression and a single trainable vector as the decompression. Unlike state-of-the-art PEFT methods such as LoRA, VeRA, and the recent MoRA, 1LoRA uses fewer parameters per layer, reducing the memory footprint and the computational cost. We extensively evaluate our method against state-of-the-art PEFT methods on multiple fine-tuning tasks, and show that our method not only outperforms them, but is also more parameter-, memory- and computationally efficient. Moreover, thanks to its memory efficiency, 1LoRA allows fine-tuning more evenly across layers, instead of focusing on specific ones (e.g. attention layers), improving performance further.
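The adapter described in the abstract (a fixed summation as compression, a single trainable vector as decompression) can be sketched as a rank-1 additive update to a frozen linear layer. This is a minimal illustration based only on the abstract, not the authors' released implementation; the class name `OneLoRALinear` and the exact placement of the update are assumptions.

```python
import torch
import torch.nn as nn

class OneLoRALinear(nn.Module):
    """Sketch of a 1LoRA-style adapter around a frozen nn.Linear.

    Compression is fixed: the sum of the input features (a scalar per
    token). Decompression is a single trainable vector of length
    out_features, so the layer adds only out_features parameters.
    """

    def __init__(self, base: nn.Linear):
        super().__init__()
        self.base = base
        # Freeze the pretrained weights; only the adapter vector trains.
        for p in self.base.parameters():
            p.requires_grad = False
        # Single trainable vector per layer, initialized to zero so the
        # adapted layer starts identical to the frozen base layer.
        self.b = nn.Parameter(torch.zeros(base.out_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Fixed compression: sum over the feature dimension -> [..., 1]
        s = x.sum(dim=-1, keepdim=True)
        # Decompression: scale the trainable vector by the summed input.
        return self.base(x) + s * self.b
```

Under this reading, the per-layer trainable parameter count is just `out_features`, compared with `r * (in_features + out_features)` for a rank-`r` LoRA, which is consistent with the abstract's claim of fine-tuning the fewest parameters per linear layer in the very-low-rank regime.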


Contributing Institute(s):
  1. Datenanalyse und Maschinenlernen (IAS-8)
  2. Computational and Systems Neuroscience (IAS-6)
Research Program(s):
  1. 5112 - Cross-Domain Algorithms, Tools, Methods Labs (ATMLs) and Research Groups (POF4-511) (POF4-511)

Appears in the scientific report 2025

The record appears in these collections:
Document types > Presentations > Conference talks
Institute collections > IAS > IAS-6
Institute collections > IAS > IAS-8
Workflow collections > Public records
Publikationsdatenbank

Record created 2025-09-23, last modified 2025-10-13


