Memory models, dynamics of synaptic plasticity, neural network dynamics,
temporal processing in neural networks, the effect of noise in neural networks.
Noise, heterogeneity, and disorder can degrade the
performance of any communication substrate, whether artificial
devices or complex systems such as neural networks. We study how to
exploit noise to better perform computational and memorization
tasks, how to model the noise sources, and which mechanisms these
sources enhance or degrade. We isolate different components of noise
in networks: (1) structural: random connectivity and the synaptic
efficacy distribution; (2) dynamical: input noise to each neuron and stochastic
output dynamics. Noise is a natural property of any system: in
neural networks, noise is necessary to improve learning, retention,
and the removal of old information, making room for new memories.
We investigate the optimal range of parameters to exploit the
stochasticity of the involved processes.
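As an illustration of these two noise components, the following is a minimal sketch (not the model studied here) of a Hopfield-style associative memory in which structural noise perturbs the Hebbian couplings and dynamical noise enters through stochastic Glauber updates. All parameter values (`sigma_J`, `beta`, the corruption fraction) are assumed purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200          # neurons
P = 10           # stored patterns (load P/N = 0.05, below capacity)
sigma_J = 0.1    # structural noise: scale of synaptic perturbation (assumed)
beta = 4.0       # inverse temperature: dynamical noise level (assumed)

# Hebbian couplings for P random binary patterns
patterns = rng.choice([-1, 1], size=(P, N))
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0.0)

# structural noise: random perturbation of the synaptic efficacies
J += sigma_J * rng.normal(size=(N, N)) / np.sqrt(N)

def glauber_step(s):
    """Dynamical noise: update one random neuron stochastically."""
    i = rng.integers(N)
    h = J[i] @ s                                  # local field on neuron i
    p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * h))  # Glauber flip probability
    s[i] = 1 if rng.random() < p_up else -1

# start from a corrupted copy of pattern 0 and let the dynamics retrieve it
s = patterns[0].copy()
flip = rng.random(N) < 0.10                       # flip 10% of the bits
s[flip] *= -1
for _ in range(20 * N):                           # ~20 sweeps of the network
    glauber_step(s)

overlap = patterns[0] @ s / N                     # retrieval quality in [-1, 1]
print(f"overlap with stored pattern: {overlap:.2f}")
```

With moderate noise levels the dynamics recover the stored pattern from the corrupted cue; pushing `sigma_J` or lowering `beta` past a critical range destroys retrieval, which is the kind of parameter regime explored above.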
Topology can also become a source of noise. By tuning the
randomness of the connections we can enhance computational
abilities such as associative memory and the separation of output
representations of temporal input patterns.
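One standard way to tune connection randomness with a single parameter is Watts-Strogatz-style rewiring, sketched below as an illustration (the rewiring probability `p` and lattice sizes are assumed, not taken from the text): `p = 0` gives a regular ring lattice, `p = 1` a fully random graph, and intermediate values interpolate between them while preserving the edge count.

```python
import numpy as np

rng = np.random.default_rng(1)

def ring_adjacency(n, k):
    """Regular ring lattice: each node links to its k nearest neighbours per side."""
    A = np.zeros((n, n), dtype=bool)
    for d in range(1, k + 1):
        idx = np.arange(n)
        A[idx, (idx + d) % n] = True
        A[(idx + d) % n, idx] = True
    return A

def rewire(A, p):
    """Redirect each edge with probability p to a uniformly random target."""
    n = A.shape[0]
    B = A.copy()
    for i, j in np.argwhere(np.triu(A)):          # each undirected edge once
        if rng.random() < p:
            B[i, j] = B[j, i] = False             # remove the old edge
            new = rng.integers(n)
            while new == i or B[i, new]:          # avoid self-loops, duplicates
                new = rng.integers(n)
            B[i, new] = B[new, i] = True          # add the rewired edge
    return B

A = ring_adjacency(100, 3)
for p in (0.0, 0.1, 1.0):
    B = rewire(A, p)
    print(p, B.sum() // 2)    # edge count stays fixed as randomness grows
```

Scanning `p` in a setup like this is one way to probe how topological disorder affects associative memory and the separability of temporal input representations.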