001     1025142
005     20241128210253.0
024 7 _ |a 10.34734/FZJ-2024-02719
|2 datacite_doi
037 _ _ |a FZJ-2024-02719
041 _ _ |a English
100 1 _ |a Tran, Viet Anh Khoa
|0 P:(DE-Juel1)192408
|b 0
|e Corresponding author
111 2 _ |a Computational and Systems Neuroscience 2024
|g COSYNE 2024
|c Lisbon
|d 2024-02-29 - 2024-03-03
|w Portugal
245 _ _ |a Continual learning using dendritic modulations on view-invariant feedforward weights
260 _ _ |c 2024
336 7 _ |a Conference Paper
|0 33
|2 EndNote
336 7 _ |a INPROCEEDINGS
|2 BibTeX
336 7 _ |a conferenceObject
|2 DRIVER
336 7 _ |a CONFERENCE_POSTER
|2 ORCID
336 7 _ |a Output Types/Conference Poster
|2 DataCite
336 7 _ |a Poster
|b poster
|m poster
|0 PUB:(DE-HGF)24
|s 1732775529_9614
|2 PUB:(DE-HGF)
|x After Call
520 _ _ |a The brain is remarkably adept at learning from a continuous stream of data without significantly forgetting previously learnt skills. Conventional machine learning models struggle at continual learning, as weight updates that optimize the current task interfere with previously learnt tasks. A simple remedy to catastrophic forgetting is freezing a network pretrained on a set of base tasks, and training task-specific readouts on this shared trunk. However, this assumes that representations in the frozen network are separable under new tasks, therefore leading to sub-par performance. To continually learn on novel task data, previous methods suggest weight consolidation – preserving weights that are most impactful for the performance of previous tasks – and memory-based approaches – where the network is allowed to see a subset of images from previous tasks. For biological networks, prior work showed that dendritic top-down modulations provide a powerful mechanism to learn novel tasks while initial feedforward weights solely extract generic view-invariant features. Therefore, we propose a continual learner that optimizes the feedforward weights towards view-invariant representations while training task-specific modulations towards separable class clusters. In a task-incremental setting, we train feedforward weights using a self-supervised algorithm, while training the task-specific modulations and readouts in a supervised fashion, both exclusively through current-task data. We show that this simple approach avoids catastrophic forgetting of class clusters, as opposed to training the whole network in a supervised manner, while also outperforming (a) task-specific readout without modulations and (b) frozen feedforward weights. This suggests that (a) top-down modulations are necessary and sufficient to shift the representations towards separable clusters and that (b) the SSL objective learns novel features based on the newly presented objects while maintaining features relevant to previous tasks, without requiring specific synaptic consolidation mechanisms.
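520 _ _ |b The architecture summarized in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes a PyTorch-style model (the names ModulatedNet, trunk, modulations, and readouts are hypothetical) in which shared feedforward weights extract generic features, a learnable per-task gain vector stands in for the dendritic top-down modulation, and each task gets its own supervised readout. The self-supervised loss on the trunk is omitted.

# Hypothetical sketch, not the authors' code: shared feedforward weights
# plus per-task multiplicative "dendritic" modulations and readouts.
import torch
import torch.nn as nn

class ModulatedNet(nn.Module):
    def __init__(self, in_dim, hidden_dim, n_classes, n_tasks):
        super().__init__()
        # Shared trunk; per the abstract, trained with an SSL objective
        # (that loss is not shown here).
        self.trunk = nn.Linear(in_dim, hidden_dim)
        # One modulation vector per task, initialized to 1 (identity gain).
        self.modulations = nn.Parameter(torch.ones(n_tasks, hidden_dim))
        # One supervised readout head per task.
        self.readouts = nn.ModuleList(
            [nn.Linear(hidden_dim, n_classes) for _ in range(n_tasks)]
        )

    def forward(self, x, task_id):
        h = torch.relu(self.trunk(x))        # generic, view-invariant features
        h = h * self.modulations[task_id]    # task-specific dendritic gain
        return self.readouts[task_id](h)     # task-specific class logits

net = ModulatedNet(in_dim=784, hidden_dim=256, n_classes=10, n_tasks=5)
logits = net(torch.randn(8, 784), task_id=0)  # batch of 8 inputs, task 0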
536 _ _ |a 5234 - Emerging NC Architectures (POF4-523)
|0 G:(DE-HGF)POF4-5234
|c POF4-523
|f POF IV
|x 0
536 _ _ |a Functional Neural Architectures (jinm60_20190501)
|0 G:(DE-Juel1)jinm60_20190501
|c jinm60_20190501
|f Functional Neural Architectures
|x 1
536 _ _ |a WestAI - AI Service Center West (01IS22094B)
|0 G:(BMBF)01IS22094B
|c 01IS22094B
|x 2
700 1 _ |a Neftci, Emre
|0 P:(DE-Juel1)188273
|b 1
700 1 _ |a Wybo, Willem
|0 P:(DE-Juel1)186881
|b 2
856 4 _ |y OpenAccess
|u https://juser.fz-juelich.de/record/1025142/files/Cosyne2024Abstract.pdf
856 4 _ |y OpenAccess
|x icon
|u https://juser.fz-juelich.de/record/1025142/files/Cosyne2024Abstract.gif?subformat=icon
856 4 _ |y OpenAccess
|x icon-1440
|u https://juser.fz-juelich.de/record/1025142/files/Cosyne2024Abstract.jpg?subformat=icon-1440
856 4 _ |y OpenAccess
|x icon-180
|u https://juser.fz-juelich.de/record/1025142/files/Cosyne2024Abstract.jpg?subformat=icon-180
856 4 _ |y OpenAccess
|x icon-640
|u https://juser.fz-juelich.de/record/1025142/files/Cosyne2024Abstract.jpg?subformat=icon-640
909 C O |o oai:juser.fz-juelich.de:1025142
|p openaire
|p open_access
|p VDB
|p driver
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 0
|6 P:(DE-Juel1)192408
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 1
|6 P:(DE-Juel1)188273
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 2
|6 P:(DE-Juel1)186881
913 1 _ |a DE-HGF
|b Key Technologies
|l Natural, Artificial and Cognitive Information Processing
|1 G:(DE-HGF)POF4-520
|0 G:(DE-HGF)POF4-523
|3 G:(DE-HGF)POF4
|2 G:(DE-HGF)POF4-500
|4 G:(DE-HGF)POF
|v Neuromorphic Computing and Network Dynamics
|9 G:(DE-HGF)POF4-5234
|x 0
914 1 _ |y 2024
915 _ _ |a OpenAccess
|0 StatID:(DE-HGF)0510
|2 StatID
920 _ _ |l yes
920 1 _ |0 I:(DE-Juel1)PGI-15-20210701
|k PGI-15
|l Neuromorphic Software Eco System
|x 0
980 _ _ |a poster
980 _ _ |a VDB
980 _ _ |a UNRESTRICTED
980 _ _ |a I:(DE-Juel1)PGI-15-20210701
980 1 _ |a FullTexts

