001041672 001__ 1041672
001041672 005__ 20250505202224.0
001041672 0247_ $$2doi$$a10.12751/NNCN.BC2024.179
001041672 037__ $$aFZJ-2025-02377
001041672 041__ $$aEnglish
001041672 1001_ $$0P:(DE-Juel1)178725$$aBouss, Peter$$b0$$eCorresponding author$$ufzj
001041672 1112_ $$aBernstein Conference 2024$$cFrankfurt$$d2024-09-29 - 2024-10-02$$wGermany
001041672 245__ $$aAssessing Neural Manifold Properties With Adapted Normalizing Flows
001041672 260__ $$c2024
001041672 3367_ $$033$$2EndNote$$aConference Paper
001041672 3367_ $$2BibTeX$$aINPROCEEDINGS
001041672 3367_ $$2DRIVER$$aconferenceObject
001041672 3367_ $$2ORCID$$aCONFERENCE_POSTER
001041672 3367_ $$2DataCite$$aOutput Types/Conference Poster
001041672 3367_ $$0PUB:(DE-HGF)24$$2PUB:(DE-HGF)$$aPoster$$bposter$$mposter$$s1746441722_14470$$xAfter Call
001041672 520__ $$aDespite the large number of active neurons in the cortex, the activity of neuronal populations is expected to lie on a low-dimensional manifold across different brain regions [1]. Variants of principal component analysis (PCA) are commonly used to assess this manifold. However, these methods are limited by the assumption that the data follow a Gaussian distribution, and they neglect additional features such as the curvature of the manifold. Hence, their performance as generative models tends to be subpar. To construct a generative model that entirely learns the statistics of neural activity with no assumptions about its distribution, we use Normalizing Flows (NFs) [2, 3]. These neural networks learn an estimator of the probability distribution of the data, based on a latent distribution of the same dimension. Their simplicity and their ability to compute the exact likelihood distinguish them from other generative networks. Our adaptation of NFs focuses on distinguishing between relevant (in-manifold) and noise (out-of-manifold) dimensions. We achieve this by identifying principal axes in the latent space. Similar to PCA, we order these axes by their explanatory power, using reconstruction performance instead of explained variance to identify and rank them. This idea was also explored in [4] with a different loss function. Our adaptation allows us to investigate the behavior of the non-linear principal axes and thus the geometry on which the data lie. For better interpretability, this is done by approximating the network as a quadratic mapping around the maximum-likelihood modes. We validate our adaptation on artificial data sets of varying complexity where the underlying dimensionality is known. This shows that our approach is able to reconstruct data with only a few latent variables. In this regard it is more efficient than PCA, in addition to achieving a higher likelihood. We apply the method to electrophysiological recordings of V1 and V4 in macaques [5], which have previously been analyzed with a Gaussian Mixture Model [6]. We show that the data lie on a manifold that features two distinct regions, each corresponding to one of the two states, eyes open and eyes closed. The shape of the manifold deviates significantly from a Gaussian distribution and thus would not be recoverable with PCA. We further analyze how non-linear interactions between groups of neurons contribute to the shape of the manifolds. Figure 1: We use Normalizing Flows to learn the distribution of the data, mapping it to a Gaussian distribution in latent space. Thereby, we enforce an alignment of the latent dimensions with the most informative non-linear axes.
001041672 536__ $$0G:(DE-HGF)POF4-5232$$a5232 - Computational Principles (POF4-523)$$cPOF4-523$$fPOF IV$$x0
001041672 536__ $$0G:(DE-HGF)POF4-5234$$a5234 - Emerging NC Architectures (POF4-523)$$cPOF4-523$$fPOF IV$$x1
001041672 536__ $$0G:(GEPRIS)368482240$$aGRK 2416 - GRK 2416: MultiSenses-MultiScales: Neue Ansätze zur Aufklärung neuronaler multisensorischer Integration (368482240)$$c368482240$$x2
001041672 536__ $$0G:(DE-Juel-1)BMBF-01IS19077A$$aRenormalizedFlows - Transparent Deep Learning with Renormalized Flows (BMBF-01IS19077A)$$cBMBF-01IS19077A$$x3
001041672 588__ $$aDataset connected to DataCite
001041672 650_7 $$2Other$$aComputational Neuroscience
001041672 650_7 $$2Other$$aData analysis, machine learning and neuroinformatics
001041672 7001_ $$0P:(DE-HGF)0$$aNestler, Sandra$$b1
001041672 7001_ $$0P:(DE-Juel1)180150$$aFischer, Kirsten$$b2$$ufzj
001041672 7001_ $$0P:(DE-Juel1)184900$$aMerger, Claudia Lioba$$b3
001041672 7001_ $$0P:(DE-HGF)0$$aRené, Alexandre$$b4
001041672 7001_ $$0P:(DE-Juel1)144806$$aHelias, Moritz$$b5$$ufzj
001041672 773__ $$a10.12751/NNCN.BC2024.179
001041672 8564_ $$uhttps://abstracts.g-node.org/conference/BC24/abstracts#/uuid/934bfa09-f657-42cb-bada-30cd023ad81e
001041672 909CO $$ooai:juser.fz-juelich.de:1041672$$pVDB
001041672 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)178725$$aForschungszentrum Jülich$$b0$$kFZJ
001041672 9101_ $$0I:(DE-588b)36225-6$$6P:(DE-Juel1)178725$$aRWTH Aachen$$b0$$kRWTH
001041672 9101_ $$0I:(DE-HGF)0$$6P:(DE-HGF)0$$aTechnion, Haifa, Israel$$b1
001041672 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)180150$$aForschungszentrum Jülich$$b2$$kFZJ
001041672 9101_ $$0I:(DE-588b)36225-6$$6P:(DE-Juel1)180150$$aRWTH Aachen$$b2$$kRWTH
001041672 9101_ $$0I:(DE-HGF)0$$6P:(DE-Juel1)184900$$aSISSA, Trieste, Italy$$b3
001041672 9101_ $$0I:(DE-588b)36225-6$$6P:(DE-HGF)0$$aRWTH Aachen$$b4$$kRWTH
001041672 9101_ $$0I:(DE-HGF)0$$6P:(DE-HGF)0$$aUniversity of Ottawa, Canada$$b4
001041672 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)144806$$aForschungszentrum Jülich$$b5$$kFZJ
001041672 9101_ $$0I:(DE-588b)36225-6$$6P:(DE-Juel1)144806$$aRWTH Aachen$$b5$$kRWTH
001041672 9131_ $$0G:(DE-HGF)POF4-523$$1G:(DE-HGF)POF4-520$$2G:(DE-HGF)POF4-500$$3G:(DE-HGF)POF4$$4G:(DE-HGF)POF$$9G:(DE-HGF)POF4-5232$$aDE-HGF$$bKey Technologies$$lNatural, Artificial and Cognitive Information Processing$$vNeuromorphic Computing and Network Dynamics$$x0
001041672 9131_ $$0G:(DE-HGF)POF4-523$$1G:(DE-HGF)POF4-520$$2G:(DE-HGF)POF4-500$$3G:(DE-HGF)POF4$$4G:(DE-HGF)POF$$9G:(DE-HGF)POF4-5234$$aDE-HGF$$bKey Technologies$$lNatural, Artificial and Cognitive Information Processing$$vNeuromorphic Computing and Network Dynamics$$x1
001041672 920__ $$lyes
001041672 9201_ $$0I:(DE-Juel1)IAS-6-20130828$$kIAS-6$$lComputational and Systems Neuroscience$$x0
001041672 980__ $$aposter
001041672 980__ $$aVDB
001041672 980__ $$aI:(DE-Juel1)IAS-6-20130828
001041672 980__ $$aUNRESTRICTED