001041674 001__ 1041674
001041674 005__ 20250505202224.0
001041674 037__ $$aFZJ-2025-02379
001041674 041__ $$aEnglish
001041674 1001_ $$0P:(DE-Juel1)178725$$aBouss, Peter$$b0$$eCorresponding author$$ufzj
001041674 1112_ $$aInternational Conference on Neuromorphic Computing and Engineering$$cAachen$$d2024-06-03 - 2024-06-06$$gICNCE 2024$$wGermany
001041674 245__ $$aExploring Neural Manifold Characteristics Using Adapted Normalizing Flows
001041674 260__ $$c2024
001041674 3367_ $$033$$2EndNote$$aConference Paper
001041674 3367_ $$2BibTeX$$aINPROCEEDINGS
001041674 3367_ $$2DRIVER$$aconferenceObject
001041674 3367_ $$2ORCID$$aCONFERENCE_POSTER
001041674 3367_ $$2DataCite$$aOutput Types/Conference Poster
001041674 3367_ $$0PUB:(DE-HGF)24$$2PUB:(DE-HGF)$$aPoster$$bposter$$mposter$$s1746441742_14468$$xAfter Call
001041674 520__ $$aDespite the large number of active neurons in the cortex, the activity of neural populations for different brain regions is expected to live on a low-dimensional manifold [1]. Variants of principal component analysis (PCA) are frequently employed to estimate this manifold. However, these methods are limited by the assumption that the data conforms to a Gaussian distribution, neglecting additional features such as the curvature of the manifold. Consequently, their performance as generative models tends to be subpar. To fully learn the statistics of neural activity and to generate artificial samples, we use Normalizing Flows (NFs) [2, 3]. These neural networks learn a dimension-preserving estimator of the probability distribution of the data. They differ from other generative networks by their simplicity and by their ability to compute the likelihood exactly. Our adaptation of NFs focuses on distinguishing between relevant (in-manifold) and noise (out-of-manifold) dimensions. This is achieved by training the NF to represent maximal data variance in minimal dimensions, akin to PCA's linear model but allowing for nonlinear mappings. Our adaptation allows us to estimate the dimensionality of the neural manifold. As every layer is a bijective mapping, the network can describe the manifold without losing information – a distinctive advantage of NFs. We validate our adaptation on artificial datasets of varying complexity where the underlying dimensionality is known. Our approach can reconstruct data using only a few latent variables and is more efficient than linear methods such as PCA. Following this approach, we identify manifolds in electrophysiological recordings from macaque V1 and V4 [4]. Our approach faithfully represents not only the variance but also higher-order features, such as the skewness and kurtosis of the data, using fewer dimensions than PCA. [1] J. Gallego et al., Neuron, 94, 5, 978-984, 2017. [2] L. Dinh et al., ICLR, 2015. [3] L. Dinh et al., ICLR, 2017. [4] X. Chen et al., Sci. Data, 9, 1, 77, 2022.
001041674 536__ $$0G:(DE-HGF)POF4-5232$$a5232 - Computational Principles (POF4-523)$$cPOF4-523$$fPOF IV$$x0
001041674 536__ $$0G:(DE-HGF)POF4-5234$$a5234 - Emerging NC Architectures (POF4-523)$$cPOF4-523$$fPOF IV$$x1
001041674 536__ $$0G:(GEPRIS)368482240$$aGRK 2416 - GRK 2416: MultiSenses-MultiScales: Neue Ansätze zur Aufklärung neuronaler multisensorischer Integration (368482240)$$c368482240$$x2
001041674 536__ $$0G:(DE-Juel-1)BMBF-01IS19077A$$aRenormalizedFlows - Transparent Deep Learning with Renormalized Flows (BMBF-01IS19077A)$$cBMBF-01IS19077A$$x3
001041674 7001_ $$0P:(DE-HGF)0$$aNestler, Sandra$$b1
001041674 7001_ $$0P:(DE-Juel1)180150$$aFischer, Kirsten$$b2$$ufzj
001041674 7001_ $$0P:(DE-Juel1)184900$$aMerger, Claudia Lioba$$b3
001041674 7001_ $$0P:(DE-HGF)0$$aRene, Alexandre$$b4
001041674 7001_ $$0P:(DE-Juel1)144806$$aHelias, Moritz$$b5$$ufzj
001041674 909CO $$ooai:juser.fz-juelich.de:1041674$$pVDB
001041674 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)178725$$aForschungszentrum Jülich$$b0$$kFZJ
001041674 9101_ $$0I:(DE-588b)36225-6$$6P:(DE-Juel1)178725$$aRWTH Aachen$$b0$$kRWTH
001041674 9101_ $$0I:(DE-HGF)0$$6P:(DE-HGF)0$$aTechnion, Haifa, Israel$$b1
001041674 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)180150$$aForschungszentrum Jülich$$b2$$kFZJ
001041674 9101_ $$0I:(DE-588b)36225-6$$6P:(DE-Juel1)180150$$aRWTH Aachen$$b2$$kRWTH
001041674 9101_ $$0I:(DE-HGF)0$$6P:(DE-Juel1)184900$$aSISSA, Trieste, Italy$$b3
001041674 9101_ $$0I:(DE-588b)36225-6$$6P:(DE-HGF)0$$aRWTH Aachen$$b4$$kRWTH
001041674 9101_ $$0I:(DE-HGF)0$$6P:(DE-HGF)0$$aUniversity of Ottawa, Canada$$b4
001041674 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)144806$$aForschungszentrum Jülich$$b5$$kFZJ
001041674 9101_ $$0I:(DE-588b)36225-6$$6P:(DE-Juel1)144806$$aRWTH Aachen$$b5$$kRWTH
001041674 9131_ $$0G:(DE-HGF)POF4-523$$1G:(DE-HGF)POF4-520$$2G:(DE-HGF)POF4-500$$3G:(DE-HGF)POF4$$4G:(DE-HGF)POF$$9G:(DE-HGF)POF4-5232$$aDE-HGF$$bKey Technologies$$lNatural, Artificial and Cognitive Information Processing$$vNeuromorphic Computing and Network Dynamics$$x0
001041674 9131_ $$0G:(DE-HGF)POF4-523$$1G:(DE-HGF)POF4-520$$2G:(DE-HGF)POF4-500$$3G:(DE-HGF)POF4$$4G:(DE-HGF)POF$$9G:(DE-HGF)POF4-5234$$aDE-HGF$$bKey Technologies$$lNatural, Artificial and Cognitive Information Processing$$vNeuromorphic Computing and Network Dynamics$$x1
001041674 920__ $$lyes
001041674 9201_ $$0I:(DE-Juel1)IAS-6-20130828$$kIAS-6$$lComputational and Systems Neuroscience$$x0
001041674 980__ $$aposter
001041674 980__ $$aVDB
001041674 980__ $$aI:(DE-Juel1)IAS-6-20130828
001041674 980__ $$aUNRESTRICTED