001     1041674
005     20250505202224.0
037 _ _ |a FZJ-2025-02379
041 _ _ |a English
100 1 _ |a Bouss, Peter
|0 P:(DE-Juel1)178725
|b 0
|e Corresponding author
|u fzj
111 2 _ |a International Conference on Neuromorphic Computing and Engineering
|g ICNCE 2024
|c Aachen
|d 2024-06-03 - 2024-06-06
|w Germany
245 _ _ |a Exploring Neural Manifold Characteristics Using Adapted Normalizing Flows
260 _ _ |c 2024
336 7 _ |a Conference Paper
|0 33
|2 EndNote
336 7 _ |a INPROCEEDINGS
|2 BibTeX
336 7 _ |a conferenceObject
|2 DRIVER
336 7 _ |a CONFERENCE_POSTER
|2 ORCID
336 7 _ |a Output Types/Conference Poster
|2 DataCite
336 7 _ |a Poster
|b poster
|m poster
|0 PUB:(DE-HGF)24
|s 1746441742_14468
|2 PUB:(DE-HGF)
|x After Call
520 _ _ |a Despite the large number of active neurons in the cortex, the activity of neural populations for different brain regions is expected to live on a low-dimensional manifold [1]. Variants of principal component analysis (PCA) are frequently employed to estimate this manifold. However, these methods are limited by the assumption that the data conform to a Gaussian distribution, neglecting additional features such as the curvature of the manifold. Consequently, their performance as generative models tends to be subpar. To fully learn the statistics of neural activity and to generate artificial samples, we use Normalizing Flows (NFs) [2, 3]. These neural networks learn a dimension-preserving estimator of the probability distribution of the data. They differ from other generative networks by their simplicity and by their ability to compute the likelihood exactly. Our adaptation of NFs focuses on distinguishing between relevant (in-manifold) and noise (out-of-manifold) dimensions. This is achieved by training the NF to represent maximal data variance in minimal dimensions, akin to PCA's linear model but allowing for nonlinear mappings. Our adaptation allows us to estimate the dimensionality of the neural manifold. As every layer is a bijective mapping, the network can describe the manifold without losing information, a distinctive advantage of NFs. We validate our adaptation on artificial datasets of varying complexity where the underlying dimensionality is known. Our approach can reconstruct data using only a few latent variables and is more efficient than linear methods such as PCA. Following this approach, we identify manifolds in electrophysiological recordings from macaque V1 and V4 [4]. Our approach faithfully represents not only the variance but also higher-order features, such as the skewness and kurtosis of the data, using fewer dimensions than PCA. [1] J. Gallego et al., Neuron, 94, 5, 978-984, 2017. [2] L. Dinh et al., ICLR, 2015. [3] L. Dinh et al., ICLR, 2017. [4] X. Chen et al., Sci. Data, 9, 1, 77, 2022.
536 _ _ |a 5232 - Computational Principles (POF4-523)
|0 G:(DE-HGF)POF4-5232
|c POF4-523
|f POF IV
|x 0
536 _ _ |a 5234 - Emerging NC Architectures (POF4-523)
|0 G:(DE-HGF)POF4-5234
|c POF4-523
|f POF IV
|x 1
536 _ _ |a GRK 2416 - GRK 2416: MultiSenses-MultiScales: Neue Ansätze zur Aufklärung neuronaler multisensorischer Integration (368482240)
|0 G:(GEPRIS)368482240
|c 368482240
|x 2
536 _ _ |a RenormalizedFlows - Transparent Deep Learning with Renormalized Flows (BMBF-01IS19077A)
|0 G:(DE-Juel-1)BMBF-01IS19077A
|c BMBF-01IS19077A
|x 3
700 1 _ |a Nestler, Sandra
|0 P:(DE-HGF)0
|b 1
700 1 _ |a Fischer, Kirsten
|0 P:(DE-Juel1)180150
|b 2
|u fzj
700 1 _ |a Merger, Claudia Lioba
|0 P:(DE-Juel1)184900
|b 3
700 1 _ |a René, Alexandre
|0 P:(DE-HGF)0
|b 4
700 1 _ |a Helias, Moritz
|0 P:(DE-Juel1)144806
|b 5
|u fzj
909 C O |o oai:juser.fz-juelich.de:1041674
|p VDB
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 0
|6 P:(DE-Juel1)178725
910 1 _ |a RWTH Aachen
|0 I:(DE-588b)36225-6
|k RWTH
|b 0
|6 P:(DE-Juel1)178725
910 1 _ |a Technion, Haifa, Israel
|0 I:(DE-HGF)0
|b 1
|6 P:(DE-HGF)0
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 2
|6 P:(DE-Juel1)180150
910 1 _ |a RWTH Aachen
|0 I:(DE-588b)36225-6
|k RWTH
|b 2
|6 P:(DE-Juel1)180150
910 1 _ |a SISSA, Trieste, Italy
|0 I:(DE-HGF)0
|b 3
|6 P:(DE-Juel1)184900
910 1 _ |a RWTH Aachen
|0 I:(DE-588b)36225-6
|k RWTH
|b 4
|6 P:(DE-HGF)0
910 1 _ |a University of Ottawa, Canada
|0 I:(DE-HGF)0
|b 4
|6 P:(DE-HGF)0
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 5
|6 P:(DE-Juel1)144806
910 1 _ |a RWTH Aachen
|0 I:(DE-588b)36225-6
|k RWTH
|b 5
|6 P:(DE-Juel1)144806
913 1 _ |a DE-HGF
|b Key Technologies
|l Natural, Artificial and Cognitive Information Processing
|1 G:(DE-HGF)POF4-520
|0 G:(DE-HGF)POF4-523
|3 G:(DE-HGF)POF4
|2 G:(DE-HGF)POF4-500
|4 G:(DE-HGF)POF
|v Neuromorphic Computing and Network Dynamics
|9 G:(DE-HGF)POF4-5232
|x 0
913 1 _ |a DE-HGF
|b Key Technologies
|l Natural, Artificial and Cognitive Information Processing
|1 G:(DE-HGF)POF4-520
|0 G:(DE-HGF)POF4-523
|3 G:(DE-HGF)POF4
|2 G:(DE-HGF)POF4-500
|4 G:(DE-HGF)POF
|v Neuromorphic Computing and Network Dynamics
|9 G:(DE-HGF)POF4-5234
|x 1
920 _ _ |l yes
920 1 _ |0 I:(DE-Juel1)IAS-6-20130828
|k IAS-6
|l Computational and Systems Neuroscience
|x 0
980 _ _ |a poster
980 _ _ |a VDB
980 _ _ |a I:(DE-Juel1)IAS-6-20130828
980 _ _ |a UNRESTRICTED

