The common understanding of long-term memory is that it is stored in the synaptic connections between neurons, such that memory retrieval occurs as the relaxation of neural activity to a constant spiking pattern that represents the memory. This idea was put forward by Hopfield (1982) and others as the attractor neural network. Synaptic dynamics challenges this mechanism, since persistent pre-synaptic activity typically weakens synaptic strength. Including short-term synaptic plasticity in an attractor neural network makes memories metastable states that rapidly switch from one to the next, depending on the sensory context. This work provides insight into the puzzle of how the brain, viewed as a dynamical system, can build stable representations of the world while at the same time effortlessly switching between them. With J. Torres (University of Granada).
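To illustrate the attractor mechanism described above, the following minimal sketch (plain NumPy; the network size, noise level, and variable names are illustrative choices, not taken from the work itself) stores a single binary pattern with a Hebbian rule and shows the network relaxing from a noisy cue back to the stored memory:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100  # number of neurons (illustrative size)

# Store one random +/-1 pattern with the Hebbian outer-product rule.
pattern = rng.choice([-1, 1], size=N)
W = np.outer(pattern, pattern) / N
np.fill_diagonal(W, 0.0)  # no self-connections

# Start from a noisy cue: flip 20% of the bits.
state = pattern.copy()
flip = rng.choice(N, size=20, replace=False)
state[flip] *= -1

# Asynchronous threshold updates relax the activity toward the attractor.
for _ in range(5):
    for i in rng.permutation(N):
        state[i] = 1 if W[i] @ state >= 0 else -1

overlap = (state @ pattern) / N  # 1.0 means perfect retrieval
```

With a single stored pattern the dynamics recover it exactly; short-term synaptic depression, as studied in the work above, would instead make this fixed point metastable.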

Control theory is the obvious theoretical framework for describing the dynamics of animal behavior, whether the movement of limbs or more abstract cognitive planning. However, computing the controls requires an accurate model of the 'plant' (the system to be controlled) and a substrate to store the state-dependent optimal control. We investigate how neural networks can be used for both tasks. For instance, echo state networks are known to accurately learn complex dynamical systems. We train these networks to learn the dynamics of the plant, and we show that the optimal control solution can also be represented effectively in these networks.
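As a rough illustration of the echo-state approach (a generic sketch, not the actual models used in this work), the code below builds a random reservoir, drives it with a toy 'plant' signal, and fits a linear readout by ridge regression to predict the signal one step ahead; the reservoir size, spectral radius, and regularization strength are all assumed illustrative values:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200    # reservoir size (illustrative)
rho = 0.9  # spectral radius, kept below 1 for the echo-state property

# Random reservoir, rescaled to the desired spectral radius.
W = rng.normal(size=(N, N))
W *= rho / max(abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, size=N)

# Toy "plant" output: a sine wave; the task is to predict u(t+1) from u(t).
t = np.arange(3000)
u = np.sin(2 * np.pi * t / 50)

# Drive the reservoir and collect states, discarding a warm-up period.
x = np.zeros(N)
states = []
for ut in u[:-1]:
    x = np.tanh(W @ x + W_in * ut)
    states.append(x.copy())
X = np.array(states[200:])
Y = u[201:]

# Ridge-regression readout: solve (X^T X + beta I) w = X^T Y.
beta = 1e-6
W_out = np.linalg.solve(X.T @ X + beta * np.eye(N), X.T @ Y)

# One-step prediction error on the training signal.
err = np.max(np.abs(X @ W_out - Y))
```

Only the linear readout is trained; the recurrent weights stay fixed, which is what makes echo state networks cheap to fit while still capturing the plant's temporal structure.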

Action selection in growing state spaces: control of network structure growth. Journal of Physics A, vol. 50, pp. 1-21, 2017.
Particle smoothing for hidden diffusion processes: adaptive path integral smoother. IEEE Transactions on Signal Processing, vol. 65, pp. 3191-3203, 2017.
Journal of Statistical Physics, vol. 162, pp. 1244-1266, 2016.

PLoS Computational Biology, vol. 12, no. 6, pp. 29-58, 2016.

Optimal control of network structure growth. NIPS workshop on Advances in Approximate Bayesian Inference, 2016.