| Large-Deviation Approach to Random Recurrent Neuronal Networks: Parameter Inference and Fluctuation-Induced Transitions |
| Type | Amount | VAT | Currency | Share | Status | Cost centre |
|---|---|---|---|---|---|---|
| Hybrid-OA | 1483.14 | 0.00 | EUR | 100.00 % | (payment completed) | 40800/E.40401.61 |
| Sum | 1483.14 | 0.00 | EUR | | | |
| Total | 1483.14 | | | | | |
| Journal Article | FZJ-2021-04016 |
2021, APS, College Park, Md.
Please use a persistent ID in citations: http://hdl.handle.net/2128/28916 or doi:10.1103/PhysRevLett.127.158302
Abstract: Here we unify the field-theoretical approach to neuronal networks with large-deviation theory. For a prototypical random recurrent network model with continuous-valued units, we show that the effective action is identical to the rate function and derive the latter using field theory. This rate function takes the form of a Kullback-Leibler divergence, which enables data-driven inference of model parameters and the calculation of fluctuations beyond mean-field theory. Lastly, we expose a regime with fluctuation-induced transitions between mean-field solutions.
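The Kullback-Leibler form of the rate function suggests a simple fitting principle: choose the model parameters that minimize the KL divergence between the observed statistics and the model's predicted statistics. The sketch below is only a toy illustration of that idea, not the paper's actual procedure; the coupling parameter `g` and the stand-in relation "stationary variance = g**2" are hypothetical assumptions made for the example, and Gaussian statistics are assumed throughout.

```python
import numpy as np

def kl_gauss(var_emp, var_model):
    """KL divergence D( N(0, var_emp) || N(0, var_model) )
    between two zero-mean Gaussians with the given variances."""
    return 0.5 * (var_emp / var_model - 1.0 + np.log(var_model / var_emp))

# "Observed" network activity: synthetic samples standing in for data
# (true standard deviation 1.5, so the empirical variance is near 2.25).
rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.5, size=10_000)
var_emp = data.var()

# Scan a hypothetical coupling strength g; in this toy model the
# stationary variance is taken to be g**2 (an assumption for illustration).
g_grid = np.linspace(0.5, 3.0, 501)
kl = kl_gauss(var_emp, g_grid**2)

# The KL-minimizing parameter recovers the variance of the data.
g_hat = g_grid[np.argmin(kl)]
```

Because the KL divergence between zero-mean Gaussians vanishes exactly when the variances match, the minimizer `g_hat` lands near 1.5, the standard deviation used to generate the synthetic data.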