Recurrently connected continuous-time rate-based neural networks
Ahokainen, Iiro (2021)
Bachelor's Programme in Engineering and Natural Sciences
Faculty of Engineering and Natural Sciences
This publication is copyrighted. You may download, display and print it for Your own personal use. Commercial use is prohibited.
Date of approval
2021-12-17
The permanent address of the publication is
https://urn.fi/URN:NBN:fi:tuni-202112119128
Abstract
One of the central questions in neuroscience is how neurons and neuronal populations communicate with each other. One approach among others is to model networks of neurons and populations as continuous nonlinear dynamical systems. Continuous-time rate-based recurrent neural networks (RNNs) offer a way to model neural networks as time-dependent nonlinear dynamical systems. The first RNNs were developed several decades ago, but recent developments in theoretical frameworks concerning neural manifolds, together with new optimization techniques, motivate studying RNNs from a new perspective.
In this thesis, the theoretical background of RNNs was studied from the neuroscience perspective, and the dynamics of RNNs were briefly analyzed with local stability analysis. Several RNNs were studied, with the focus on the two most widely used variants: the additive system and the Wilson-Cowan system. The main difference between the models is how fast they respond to postsynaptic currents. It was found that the additive system and the Wilson-Cowan system exhibit the same fixed points, and that the asymptotic stability near hyperbolic fixed points is the same for both models. This finding explains why the models behave similarly when external inputs are time-invariant. In addition to the additive and Wilson-Cowan systems, other RNN variants were briefly studied from a theoretical perspective, explaining the relation between neuroscience and the RNNs.
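The fixed-point correspondence described above can be checked numerically. The sketch below is not from the thesis; the model forms are the standard textbook equations (additive: τ dx/dt = −x + Wφ(x) + I; Wilson-Cowan: τ dr/dt = −r + φ(Wr + I)), and the weights, input, and tanh nonlinearity are illustrative assumptions. With r = φ(x), a fixed point x* of the additive system maps to a fixed point r* = φ(x*) of the Wilson-Cowan system, since Wφ(x*) + I = x*.

```python
# Numerical sketch: both systems are integrated with forward Euler under
# the same weights W and the same constant (time-invariant) input I.
import numpy as np

rng = np.random.default_rng(0)
N = 5
W = 0.3 * rng.standard_normal((N, N))  # weak coupling -> a stable fixed point
I = rng.standard_normal(N)             # constant external input
tau, dt = 1.0, 0.01
phi = np.tanh                          # smooth, saturating rate nonlinearity

x = np.zeros(N)  # additive-system state (synaptic current)
r = np.zeros(N)  # Wilson-Cowan state (firing rate)
for _ in range(20000):  # integrate long enough to reach steady state
    x = x + dt / tau * (-x + W @ phi(x) + I)
    r = r + dt / tau * (-r + phi(W @ r + I))

# At the fixed points, the rate phi(x*) of the additive system should
# coincide with the Wilson-Cowan fixed point r*.
print(np.allclose(phi(x), r, atol=1e-6))
```

The correspondence concerns fixed points and local stability only; away from equilibrium, or under time-varying inputs, the two systems generally produce different trajectories.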
Altogether, the results of the thesis help to understand the dynamics of these systems and guide the selection of RNNs for neuroscientifically interesting computational tasks, such as modelling the brain's network dynamics. RNNs combined with dimensionality reduction methods can reveal low-dimensional neural manifolds that may explain fundamental brain mechanisms.
Collections
- Bachelor's theses [8918]