001007805 001__ 1007805
001007805 005__ 20250603202305.0
001007805 0247_ $$2doi$$a10.12751/NNCN.BC2022.104
001007805 0247_ $$2datacite_doi$$a10.34734/FZJ-2023-02199
001007805 037__ $$aFZJ-2023-02199
001007805 041__ $$aEnglish
001007805 1001_ $$0P:(DE-Juel1)178725$$aBouss, Peter$$b0$$eCorresponding author$$ufzj
001007805 1112_ $$aBernstein Conference$$cBerlin$$d2022-09-13 - 2022-09-16$$wGermany
001007805 245__ $$aDimensionality reduction with normalizing flows
001007805 260__ $$c2022
001007805 3367_ $$033$$2EndNote$$aConference Paper
001007805 3367_ $$2BibTeX$$aINPROCEEDINGS
001007805 3367_ $$2DRIVER$$aconferenceObject
001007805 3367_ $$2ORCID$$aCONFERENCE_POSTER
001007805 3367_ $$2DataCite$$aOutput Types/Conference Poster
001007805 3367_ $$0PUB:(DE-HGF)24$$2PUB:(DE-HGF)$$aPoster$$bposter$$mposter$$s1748950919_19036$$xAfter Call
001007805 500__ $$aCopyright: © (2022) Bouss P, Nestler S, René A, Helias M
001007805 502__ $$cRWTH Aachen
001007805 520__ $$aDespite the large number of active neurons in the cortex, the activity of neural populations in various brain regions is expected to live on a low-dimensional manifold [1]. Among the most common tools to estimate the mapping to this manifold, along with its dimension, are the many variants of principal component analysis [2]. Despite their apparent success, these procedures have the disadvantage that they capture only linear correlations and that they perform poorly as generative models.
To fully learn the statistics of neural activity and to generate artificial samples, we make use of normalizing flows (NFs) [3, 4, 5]. These neural networks learn a dimension-preserving estimator of the data probability distribution. They stand out in comparison to generative adversarial networks (GANs) and variational autoencoders (VAEs) for their simplicity (only one invertible network is learned) and for their exact estimation of the likelihood, owing to tractable Jacobians at each building block.
We aim to modify NFs such that they can discriminate relevant (in-manifold) from noise (out-of-manifold) dimensions. To this end, we penalize the participation of each single latent variable in the reconstruction of the data through the inverse mapping (following a different reasoning than [6]). We can thus not only estimate the dimensionality of the activity sub-space but also describe the underlying manifold without the need to discard any information.
We validate our modification on controlled data sets of different complexity. In particular, we highlight differences between affine and additive coupling layers in normalizing flows [7], and show that the former lead to pathologies when the data topology is non-trivial, or when the data set is composed of classes with different volumes. We further illustrate the power of our modified NFs by reconstructing data using only a few dimensions. We finally apply this technique to identify manifolds in EEG recordings from a dataset showing high gamma activity (described in [8]), obtained from 128 electrodes during four different movement tasks.
Acknowledgements: This project is funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) - 368482240/GRK 2416; and by the German Federal Ministry of Education and Research (BMBF Grant 01IS19077A to Jülich).
References:
[1] Gao, P., Trautmann, E., Yu, B., Santhanam, G., Ryu, S., Shenoy, K., & Ganguli, S. (2017). A theory of multineuronal dimensionality, dynamics and measurement. bioRxiv, 214262. doi:10.1101/214262
[2] Gallego, J. A., Perich, M. G., Miller, L. E., & Solla, S. A. (2017). Neural manifolds for the control of movement. Neuron, 94(5), 978-984. doi:10.1016/j.neuron.2017.05.025
[3] Dinh, L., Krueger, D., & Bengio, Y. (2014). NICE: Non-linear independent components estimation. arXiv preprint arXiv:1410.8516. doi:10.48550/arXiv.1410.8516
[4] Dinh, L., Sohl-Dickstein, J., & Bengio, S. (2016). Density estimation using Real NVP. arXiv preprint arXiv:1605.08803. doi:10.48550/arXiv.1605.08803
[5] Kingma, D. P., & Dhariwal, P. (2018). Glow: Generative flow with invertible 1x1 convolutions. Advances in Neural Information Processing Systems, 31.
[6] Cunningham, E., Cobb, A., & Jha, S. (2022). Principal manifold flows. arXiv preprint arXiv:2202.07037. doi:10.48550/arXiv.2202.07037
[7] Behrmann, J., Vicol, P., Wang, K. C., Grosse, R., & Jacobsen, J. H. (2021). Understanding and mitigating exploding inverses in invertible neural networks. In International Conference on Artificial Intelligence and Statistics (pp. 1792-1800). PMLR.
[8] Schirrmeister, R. T., Springenberg, J. T., Fiederer, L. D. J., Glasstetter, M., Eggensperger, K., Tangermann, M., ... & Ball, T. (2017). Deep learning with convolutional neural networks for EEG decoding and visualization. Human Brain Mapping, 38(11), 5391-5420. doi:10.1002/hbm.23730
001007805 536__ $$0G:(DE-HGF)POF4-5231$$a5231 - Neuroscientific Foundations (POF4-523)$$cPOF4-523$$fPOF IV$$x0
001007805 536__ $$0G:(DE-HGF)POF4-5232$$a5232 - Computational Principles (POF4-523)$$cPOF4-523$$fPOF IV$$x1
001007805 536__ $$0G:(GEPRIS)368482240$$aGRK 2416 - GRK 2416: MultiSenses-MultiScales: Neue Ansätze zur Aufklärung neuronaler multisensorischer Integration (368482240)$$c368482240$$x2
001007805 536__ $$0G:(DE-Juel-1)BMBF-01IS19077A$$aRenormalizedFlows - Transparent Deep Learning with Renormalized Flows (BMBF-01IS19077A)$$cBMBF-01IS19077A$$x3
001007805 588__ $$aDataset connected to DataCite
001007805 650_7 $$2Other$$aComputational Neuroscience
001007805 650_7 $$2Other$$aData analysis, machine learning, neuroinformatics
001007805 7001_ $$0P:(DE-Juel1)174585$$aNestler, Sandra$$b1$$ufzj
001007805 7001_ $$0P:(DE-Juel1)178936$$aRene, Alexandre$$b2$$ufzj
001007805 7001_ $$0P:(DE-Juel1)144806$$aHelias, Moritz$$b3$$eLast author$$ufzj
001007805 773__ $$a10.12751/NNCN.BC2022.104
001007805 8564_ $$uhttps://doi.org/10.12751/nncn.bc2022.104
001007805 8564_ $$uhttps://juser.fz-juelich.de/record/1007805/files/Abstract.pdf$$yRestricted
001007805 8564_ $$uhttps://juser.fz-juelich.de/record/1007805/files/Poster.pdf$$yOpenAccess
001007805 909CO $$ooai:juser.fz-juelich.de:1007805$$popen_access$$popenaire$$pdriver$$pVDB
001007805 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)178725$$aForschungszentrum Jülich$$b0$$kFZJ
001007805 9101_ $$0I:(DE-588b)36225-6$$6P:(DE-Juel1)178725$$aRWTH Aachen$$b0$$kRWTH
001007805 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)174585$$aForschungszentrum Jülich$$b1$$kFZJ
001007805 9101_ $$0I:(DE-588b)36225-6$$6P:(DE-Juel1)174585$$aRWTH Aachen$$b1$$kRWTH
001007805 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)178936$$aForschungszentrum Jülich$$b2$$kFZJ
001007805 9101_ $$0I:(DE-588b)36225-6$$6P:(DE-Juel1)178936$$aRWTH Aachen$$b2$$kRWTH
001007805 9101_ $$0I:(DE-HGF)0$$6P:(DE-Juel1)178936$$aUniversity of Ottawa$$b2
001007805 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)144806$$aForschungszentrum Jülich$$b3$$kFZJ
001007805 9101_ $$0I:(DE-588b)36225-6$$6P:(DE-Juel1)144806$$aRWTH Aachen$$b3$$kRWTH
001007805 9131_ $$0G:(DE-HGF)POF4-523$$1G:(DE-HGF)POF4-520$$2G:(DE-HGF)POF4-500$$3G:(DE-HGF)POF4$$4G:(DE-HGF)POF$$9G:(DE-HGF)POF4-5231$$aDE-HGF$$bKey Technologies$$lNatural, Artificial and Cognitive Information Processing$$vNeuromorphic Computing and Network Dynamics$$x0
001007805 9131_ $$0G:(DE-HGF)POF4-523$$1G:(DE-HGF)POF4-520$$2G:(DE-HGF)POF4-500$$3G:(DE-HGF)POF4$$4G:(DE-HGF)POF$$9G:(DE-HGF)POF4-5232$$aDE-HGF$$bKey Technologies$$lNatural, Artificial and Cognitive Information Processing$$vNeuromorphic Computing and Network Dynamics$$x1
001007805 9141_ $$y2023
001007805 915__ $$0StatID:(DE-HGF)0510$$2StatID$$aOpenAccess
001007805 920__ $$lyes
001007805 9201_ $$0I:(DE-Juel1)INM-6-20090406$$kINM-6$$lComputational and Systems Neuroscience$$x0
001007805 9201_ $$0I:(DE-Juel1)IAS-6-20130828$$kIAS-6$$lComputational and Systems Neuroscience$$x1
001007805 9201_ $$0I:(DE-Juel1)INM-10-20170113$$kINM-10$$lJARA-Institut Brain structure-function relationships$$x2
001007805 980__ $$aposter
001007805 980__ $$aVDB
001007805 980__ $$aI:(DE-Juel1)INM-6-20090406
001007805 980__ $$aI:(DE-Juel1)IAS-6-20130828
001007805 980__ $$aI:(DE-Juel1)INM-10-20170113
001007805 980__ $$aUNRESTRICTED
001007805 9801_ $$aFullTexts
001007805 981__ $$aI:(DE-Juel1)IAS-6-20130828