Journal Article FZJ-2020-01552

Capacity of the covariance perceptron


2020
IOP Publishing, Bristol

Journal of Physics A: Mathematical and Theoretical 53(35), 354002 (2020) [10.1088/1751-8121/ab82dd]


Please use a persistent id in citations: doi:10.1088/1751-8121/ab82dd

Abstract: The classical perceptron is a simple neural network that performs a binary classification by a linear mapping between static inputs and outputs and application of a threshold. For small inputs, neural networks in a stationary state also perform an effectively linear input-output transformation, but of an entire time series. Choosing the temporal mean of the time series as the feature for classification, the linear transformation of the network with subsequent thresholding is equivalent to the classical perceptron. Here we show that choosing covariances of time series as the feature for classification maps the neural network to what we call a 'covariance perceptron': a mapping between covariances that is bilinear in the weights. By extending Gardner's theory of connections to this bilinear problem, using a replica-symmetric mean-field theory, we compute the pattern and information capacities of the covariance perceptron in the infinite-size limit. Closed-form expressions reveal superior pattern capacity in the binary classification task compared to the classical perceptron in the case of a high-dimensional input and low-dimensional output. For less convergent networks, the mean perceptron classifies a larger number of stimuli. However, since covariances span a much larger input and output space than means, the amount of information stored in the covariance perceptron exceeds that of its classical counterpart. For strongly convergent connectivity it is superior by a factor equal to the number of input neurons. The theoretical calculations are validated numerically for finite-size systems using gradient-based optimization of a soft margin, as well as numerical solvers for the NP-hard quadratically constrained quadratic programming problem to which training can be mapped.
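The bilinear covariance mapping at the heart of the abstract is compact enough to illustrate directly. Below is a minimal numerical sketch in Python/NumPy, assuming a stationary linear network whose input-output transformation of covariances reduces to Q = W P W^T, as described above; the dimensions, the random weight matrix W, and the random input pattern P are illustrative placeholders, not the paper's trained solution.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

n_in, n_out = 10, 2  # convergent network: many inputs, few outputs

# Hypothetical readout weights W (trained in the paper; random here).
W = rng.standard_normal((n_out, n_in)) / np.sqrt(n_in)

# An input covariance pattern P: symmetric positive semidefinite (n_in x n_in).
A = rng.standard_normal((n_in, n_in))
P = (A @ A.T) / n_in

# For a stationary linear network y = W x, covariances map bilinearly in W:
#     Q = W P W^T
Q = W @ P @ W.T

# Binary classification: threshold an off-diagonal (cross-covariance) entry,
# in analogy to thresholding the output mean in the classical perceptron.
label = 1 if Q[0, 1] > 0 else -1
print(f"cross-covariance Q_01 = {Q[0, 1]:+.3f} -> label {label}")
```

Training, per the abstract, would adjust W so that the chosen entry of Q falls on the correct side of the threshold for every input pattern, e.g. via gradient-based optimization of a soft margin or via a solver for the equivalent quadratically constrained quadratic program.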

Contributing Institute(s):
  1. Computational and Systems Neuroscience (INM-6)
  2. Theoretical Neuroscience (IAS-6)
  3. JARA-Institut Brain structure-function relationships (INM-10)
Research Program(s):
  1. 571 - Connectivity and Activity (POF3-571)
  2. 574 - Theory, modelling and simulation (POF3-574)
  3. MSNN - Theory of multi-scale neuronal networks (HGF-SMHB-2014-2018)
  4. HBP SGA2 - Human Brain Project Specific Grant Agreement 2 (785907)
  5. neuroIC002 - Recurrence and stochasticity for neuro-inspired computation (EXS-SF-neuroIC002)

Appears in the scientific report 2020
Database coverage:
Medline ; Creative Commons Attribution CC BY 4.0 ; OpenAccess ; Clarivate Analytics Master Journal List ; Current Contents - Physical, Chemical and Earth Sciences ; Ebsco Academic Search ; IF < 5 ; JCR ; National-Konsortium ; Nationallizenz ; SCOPUS ; Science Citation Index ; Science Citation Index Expanded ; Web of Science Core Collection

The record appears in these collections:
Document Types > Articles > Journal Articles
Institute Collections > INM > INM-10
Institute Collections > IAS > IAS-6
Institute Collections > INM > INM-6
Workflow Collections > Public Entries
Workflow Collections > Publication Fees
Publication Database
Open Access

Record created on 2020-03-20, last modified on 2024-03-13