001     1009239
005     20240313103129.0
037 _ _ |a FZJ-2023-02701
041 _ _ |a English
100 1 _ |a Nestler, Sandra
|0 P:(DE-Juel1)174585
|b 0
|e Corresponding author
|u fzj
111 2 _ |a Seminar Talk, Machine Learning Seminar
|c Aachen
|d 2023-07-13 - 2023-07-13
|w Germany
245 _ _ |a A statistical perspective on learning of time series in neural networks
|f 2023-07-13 -
260 _ _ |c 2023
336 7 _ |a Conference Paper
|0 33
|2 EndNote
336 7 _ |a Other
|2 DataCite
336 7 _ |a INPROCEEDINGS
|2 BibTeX
336 7 _ |a LECTURE_SPEECH
|2 ORCID
336 7 _ |a Talk (non-conference)
|b talk
|m talk
|0 PUB:(DE-HGF)31
|s 1689925979_19178
|2 PUB:(DE-HGF)
|x Invited
336 7 _ |a Other
|2 DINI
520 _ _ |a In this talk, we explore a statistical perspective on learning in neural networks, drawing inspiration from both neuroscience and machine learning. We investigate the stochastic nature of neural activity and stimuli, using tools from statistical physics to address these aspects. The focus lies on the time-dependent processing of stimuli. Recurrent neural networks, a concept inspired by the brain, handle time series naturally. For weakly non-linear interactions, we develop a method to approximate the network dynamics, leading to improved performance in a random recurrent reservoir. For linear interactions, we investigate how the optimal classifier balances stability and performance in the presence of background noise. We then study how non-linear interactions shape the statistical processing of stimuli, demonstrating a direct relationship between non-linearity, representation, and higher-order statistics using a single-layer perceptron. Finally, we explore learning the data distribution itself, employing an invertible neural network (a normalizing flow) to extract informative modes. This unsupervised approach uncovers the underlying structure, dimensionality, and meaningful latent features of the data.
536 _ _ |a 5231 - Neuroscientific Foundations (POF4-523)
|0 G:(DE-HGF)POF4-5231
|c POF4-523
|f POF IV
|x 0
536 _ _ |a 5232 - Computational Principles (POF4-523)
|0 G:(DE-HGF)POF4-5232
|c POF4-523
|f POF IV
|x 1
536 _ _ |a 5234 - Emerging NC Architectures (POF4-523)
|0 G:(DE-HGF)POF4-5234
|c POF4-523
|f POF IV
|x 2
536 _ _ |a HBP SGA3 - Human Brain Project Specific Grant Agreement 3 (945539)
|0 G:(EU-Grant)945539
|c 945539
|f H2020-SGA-FETFLAG-HBP-2019
|x 3
536 _ _ |a ACA - Advanced Computing Architectures (SO-092)
|0 G:(DE-HGF)SO-092
|c SO-092
|x 4
536 _ _ |a RenormalizedFlows - Transparent Deep Learning with Renormalized Flows (BMBF-01IS19077A)
|0 G:(DE-Juel-1)BMBF-01IS19077A
|c BMBF-01IS19077A
|x 5
536 _ _ |a SDS005 - Towards an integrated data science of complex natural systems (PF-JARA-SDS005)
|0 G:(DE-Juel-1)PF-JARA-SDS005
|c PF-JARA-SDS005
|x 6
536 _ _ |a neuroIC002 - Recurrence and stochasticity for neuro-inspired computation (EXS-SF-neuroIC002)
|0 G:(DE-82)EXS-SF-neuroIC002
|c EXS-SF-neuroIC002
|x 7
536 _ _ |a GRK 2416 - GRK 2416: MultiSenses-MultiScales: New approaches to elucidating neuronal multisensory integration (368482240)
|0 G:(GEPRIS)368482240
|c 368482240
|x 8
700 1 _ |a Keup, Christian
|0 P:(DE-Juel1)171384
|b 1
|u fzj
700 1 _ |a Dahmen, David
|0 P:(DE-Juel1)156459
|b 2
|u fzj
700 1 _ |a Gilson, Matthieu
|0 P:(DE-Juel1)184621
|b 3
700 1 _ |a Rauhut, Holger
|0 P:(DE-HGF)0
|b 4
700 1 _ |a Boutaib, Youness
|0 P:(DE-HGF)0
|b 5
700 1 _ |a Bouss, Peter
|0 P:(DE-Juel1)178725
|b 6
|u fzj
700 1 _ |a Merger, Claudia Lioba
|0 P:(DE-Juel1)184900
|b 7
|u fzj
700 1 _ |a Fischer, Kirsten
|0 P:(DE-Juel1)180150
|b 8
|u fzj
700 1 _ |a René, Alexandre
|0 P:(DE-Juel1)178936
|b 9
|u fzj
700 1 _ |a Helias, Moritz
|0 P:(DE-Juel1)144806
|b 10
|u fzj
856 4 _ |u https://juser.fz-juelich.de/record/1009239/files/A%20statistical%20perspective%20on%20learning%20of%20time%20series%20in%20neural%20networks.pptx
|y Restricted
909 C O |o oai:juser.fz-juelich.de:1009239
|p openaire
|p VDB
|p ec_fundedresources
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 0
|6 P:(DE-Juel1)174585
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 1
|6 P:(DE-Juel1)171384
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 2
|6 P:(DE-Juel1)156459
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 6
|6 P:(DE-Juel1)178725
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 7
|6 P:(DE-Juel1)184900
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 8
|6 P:(DE-Juel1)180150
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 9
|6 P:(DE-Juel1)178936
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 10
|6 P:(DE-Juel1)144806
913 1 _ |a DE-HGF
|b Key Technologies
|l Natural, Artificial and Cognitive Information Processing
|1 G:(DE-HGF)POF4-520
|0 G:(DE-HGF)POF4-523
|3 G:(DE-HGF)POF4
|2 G:(DE-HGF)POF4-500
|4 G:(DE-HGF)POF
|v Neuromorphic Computing and Network Dynamics
|9 G:(DE-HGF)POF4-5231
|x 0
913 1 _ |a DE-HGF
|b Key Technologies
|l Natural, Artificial and Cognitive Information Processing
|1 G:(DE-HGF)POF4-520
|0 G:(DE-HGF)POF4-523
|3 G:(DE-HGF)POF4
|2 G:(DE-HGF)POF4-500
|4 G:(DE-HGF)POF
|v Neuromorphic Computing and Network Dynamics
|9 G:(DE-HGF)POF4-5232
|x 1
913 1 _ |a DE-HGF
|b Key Technologies
|l Natural, Artificial and Cognitive Information Processing
|1 G:(DE-HGF)POF4-520
|0 G:(DE-HGF)POF4-523
|3 G:(DE-HGF)POF4
|2 G:(DE-HGF)POF4-500
|4 G:(DE-HGF)POF
|v Neuromorphic Computing and Network Dynamics
|9 G:(DE-HGF)POF4-5234
|x 2
914 1 _ |y 2023
920 _ _ |l yes
920 1 _ |0 I:(DE-Juel1)INM-6-20090406
|k INM-6
|l Computational and Systems Neuroscience
|x 0
920 1 _ |0 I:(DE-Juel1)IAS-6-20130828
|k IAS-6
|l Theoretical Neuroscience
|x 1
920 1 _ |0 I:(DE-Juel1)INM-10-20170113
|k INM-10
|l JARA-Institut Brain structure-function relationships
|x 2
980 _ _ |a talk
980 _ _ |a VDB
980 _ _ |a I:(DE-Juel1)INM-6-20090406
980 _ _ |a I:(DE-Juel1)IAS-6-20130828
980 _ _ |a I:(DE-Juel1)INM-10-20170113
980 _ _ |a UNRESTRICTED
981 _ _ |a I:(DE-Juel1)IAS-6-20130828

