

Institute of Neuroinformatics


Resistive memory as a configurable structure for electronics

One powerful knob for “morphing” the properties of electronic substrates is resistance, which makes resistive memory devices (RRAMs) ideal elements for this purpose.
These devices are emerging memories whose physical structure, and thus resistance, changes as a function of the input they receive.

The configurability of the resistance of this memory can implement the functionality of brain-inspired systems at different levels of the spatial hierarchy, from synapses and neurons to dendritic arbors and connectivity, and over the past years we have designed example circuits for this very purpose:

Resistive memory as synaptic weight elements in an "in-memory" architecture.

Resistive memory has been extensively used to implement synapses in neural networks, in crossbar-like architectures for in-memory computing, where the resistive memory holds the weights of the neural network, i.e., its synapses. When an input is applied as a voltage pulse across the conductance of the resistive memory holding a weight, Ohm's law “naturally” implements the multiplication between input and weight.

M. Payvand, M. V. Nair, L. K. Müller and G. Indiveri, "A neuromorphic systems approach to in-memory computing with non-ideal memristive devices: From mitigation to exploitation", Faraday Discussions, 2019

F. Moro et al., "Hardware calibrated learning to compensate heterogeneity in analog RRAM-based Spiking Neural Networks", International Symposium on Circuits and Systems (ISCAS), 2022
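The multiply-accumulate described above can be sketched in a few lines: the column currents of a crossbar are the matrix-vector product of the input voltages and the programmed conductances. All values below are illustrative, not measured device parameters.

```python
import numpy as np

# Illustrative RRAM conductances (siemens): G[i, j] is the cell connecting
# input row i to output column j, i.e. one synaptic weight.
G = np.array([[1e-6, 2e-6],
              [3e-6, 4e-6]])

# Inputs applied as voltage pulses (volts) on the crossbar rows.
v = np.array([0.1, 0.2])

# Ohm's law per cell (I = G * V) multiplies input by weight; Kirchhoff's
# current law sums the cell currents along each column, so the column
# currents are the matrix-vector product G^T v.
i_cols = G.T @ v
print(i_cols)  # column currents in amperes
```

The physics performs the entire multiply-accumulate in one step, which is the core appeal of in-memory computing: no weight ever has to be fetched from a separate memory.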

Resistive memory as neuron parameters

Neurons have internal parameters that govern their dynamics. These include the time constant of the neuron, its gain, its refractory period and, in more complex models, adaptation dynamics. These parameters can be stored locally with resistive memory, which at the same time gives rise to the “emulation” of neural dynamics.

M. Payvand, F. Moro, K. Nomura, T. Dalgaty, E. Vianello, Y. Nishi, G. Indiveri, “Self-organization of an inhomogeneous memristive hardware for sequence learning”, Nature Communications, 2022
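As a toy illustration of a resistive memory setting a neuron parameter, the sketch below simulates a leaky integrator whose time constant tau = R·C is fixed by the memory's resistance, so a higher-resistance state directly yields slower membrane dynamics. All device values are made up for illustration.

```python
def leaky_decay(v0, r_mem, c_mem=1e-9, dt=1e-5, steps=100):
    """Membrane decay of a leaky integrator whose time constant
    tau = R * C is set by the RRAM resistance r_mem (illustrative)."""
    tau = r_mem * c_mem
    v = v0
    for _ in range(steps):
        v -= dt / tau * v  # forward-Euler leak step
    return v

v_fast = leaky_decay(1.0, r_mem=1e5)  # tau = 0.1 ms: fast decay
v_slow = leaky_decay(1.0, r_mem=1e6)  # tau = 1 ms: slow decay
assert v_slow > v_fast  # higher resistance -> longer time constant
```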

Resistive memory as both delays and weights in dendritic arbor

Resistive memory can provide both temporal parameters (delays) and spatial parameters (weights) in Spiking Neural Network hardware. Incoming events are first delayed through the RC element formed by a first resistive memory in parallel with a small capacitor, and then weighted by a second resistive memory.

M. Payvand, S. D’Agostino, F. Moro, Y. Demirag, G. Indiveri, E. Vianello, “Dendritic Computation through Exploiting Resistive Memory as both Delays and Weights” (2023)
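A minimal event-based sketch of this delay-and-weight structure, assuming the first memory's resistance (with a small capacitor) sets an RC delay and the second memory's conductance sets the weight; the function name and values are hypothetical.

```python
def dendritic_branch(spike_times, r_delay, g_weight, c=1e-12):
    """Each input spike is delayed by the RC product of the first RRAM
    and a small capacitor, then scaled by the conductance of the
    second RRAM, which acts as the weight."""
    delay = r_delay * c  # seconds
    return [(t + delay, g_weight) for t in spike_times]

# A high-resistance delay cell (10 Mohm) with 1 pF gives a 10 us delay.
events = dendritic_branch([0.0, 1e-3], r_delay=1e7, g_weight=2e-6)
```

Programming the first memory to a different resistance state changes the branch's delay without any extra hardware, which is what makes the same device usable as both a temporal and a spatial parameter.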


Resistive memory as connectivity

Resistive memory can also be used to store the connectivity of the networks. Beyond the local connectivity implemented by the crossbar, the routing of information between crossbars is usually implemented through a Network-on-Chip (NoC) design in most large-scale neuromorphic hardware platforms. By implementing this connectivity with resistive memories, events can be physically passed or blocked from one core to another. This gives rise to a physical implementation of small-world graphs.

T. Dalgaty*, F. Moro*, Y. Demirağ*, A. De Pra, G. Indiveri, E. Vianello, and M. Payvand, "The neuromorphic Mosaic: in-memory computing and routing for small-world graphical networks" (2023)
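The routing idea can be sketched as a binary matrix of RRAM states between cores: a low-resistance cell passes an event, a high-resistance cell blocks it. Dense local links plus a few long-range shortcuts give a small-world-like graph. The layout below is purely illustrative.

```python
import numpy as np

n_cores = 8
# route[src, dst] = 1: the RRAM routing cell is in its low-resistance
# state and passes events from core src to core dst; 0 blocks them.
route = np.zeros((n_cores, n_cores), dtype=int)
for i in range(n_cores):
    route[i, (i + 1) % n_cores] = 1  # local ring links
route[0, 4] = route[2, 6] = 1        # a few long-range shortcuts

def targets(src):
    """Cores that receive an event emitted by core `src`."""
    return np.flatnonzero(route[src])

print(targets(0))  # local neighbor 1 plus the shortcut to core 4
```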

Resistive memory for on-chip learning

Adapting the resistance of the RRAM based on streaming sensory information enables on-chip online learning. Online learning is the next big leap for reducing the power and memory costs associated with training neural networks. Instead of storing large datasets, reading them from memory, and updating the parameters of the network on large batches, online learning changes the network parameters as the information streams into the on-chip network. Short-term applications for such on-chip learning include the personalization of wearable devices to the user.

As sensory information spans multiple temporal scales, “learning” to change the RRAMs should also happen across multiple temporal scales. In the brain, there is a direct correlation between the spatial scale of a computational module and the temporal scale at which it changes.

Therefore, incorporating resistive memory at the different levels of the hierarchy enables on-chip configurability and online learning at different temporal scales.
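The contrast with batch training can be made concrete with a streaming update rule: each incoming sample immediately nudges the parameters, so no dataset needs to be stored. Below is a plain least-mean-squares sketch on a linear model, an assumption chosen for illustration rather than the actual on-chip learning rule.

```python
import numpy as np

def online_step(w, x, y, lr=0.1):
    """One streaming update: adjust the weights from a single sample
    (x, y) the moment it arrives, instead of accumulating a batch."""
    err = float(w @ x) - y
    return w - lr * err * x

rng = np.random.default_rng(0)
w_true = np.array([0.5, -0.3])   # hypothetical target mapping
w = np.zeros(2)
for _ in range(500):             # samples "stream in" one at a time
    x = rng.standard_normal(2)
    y = float(w_true @ x)
    w = online_step(w, x, y)     # parameters adapt on the fly
```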

In the past, we have worked on mechanisms for the online tuning of the memory devices at the synaptic, neuronal, and dendritic levels.

Synaptic plasticity: changing the value of the RRAM devices online by modulating the compliance current

M. Payvand, Y. Demirag, T. Dalgaty, E. Vianello, G. Indiveri, “Analog Weight Updates with Compliance Current Modulation of Binary ReRAMs for On-Chip Learning”

Synaptic plasticity: changing the value of the RRAM devices online in a Spiking Recurrent Neural Network

Neuronal plasticity: changing the value of the RRAM incorporated in the design of the neuron, to keep the neuron's firing rate in a desirable range of activity, for balanced activity in the recurrent network.

M. Payvand, F. Moro, K. Nomura, T. Dalgaty, E. Vianello, Y. Nishi, G. Indiveri, “Self-organization of an inhomogeneous memristive hardware for sequence learning”, Nature Communications, 2022
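A minimal sketch of such a homeostatic rule, assuming the neuron's leak conductance is the RRAM-controlled parameter: when the firing rate drifts above the target the leak is strengthened (quieting the neuron), and vice versa. The function name and constants are hypothetical.

```python
def homeostatic_update(g_leak, rate, target=10.0, lr=1e-3):
    """Nudge the RRAM-set leak conductance so the neuron's firing
    rate (Hz) relaxes toward the target: more leak when too active,
    less leak when too quiet (clamped to stay non-negative)."""
    return max(0.0, g_leak + lr * (rate - target))

g = 1e-2
g = homeostatic_update(g, rate=25.0)  # too active -> leak increases
```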

Dendritic plasticity: using the dendritic structure of the neuron to locally generate an error signal and use it to update the synaptic weights

Matteo Cartiglia*, Arianna Rubino*, Shyam Narayanan, Charlotte Frenkel, Germain Haessig, Giacomo Indiveri, Melika Payvand, "Stochastic dendrites enable online learning in mixed-signal neuromorphic processing systems", ISCAS 2022

Main challenges for online learning

There are three main challenges associated with online learning: spatial credit assignment, temporal credit assignment, and limited bit precision. We have worked on solving these issues in the following works:

Error-triggered learning:
We proposed a solution to the spatial credit assignment problem by using the local information of the pre- and postsynaptic neurons, along with an error-encoded event as a third factor, to implement backpropagation:

Melika Payvand*, Mohammed E. Fouda*, Fadi Kurdahi, Ahmed M. Eltawil, and Emre O. Neftci, "On-chip error-triggered learning of multi-layer memristive spiking neural networks"
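The three-factor idea can be sketched as follows: keep the usual local pre- and postsynaptic terms, but apply a weight update only when an error-encoding event (the third factor) arrives. This is a schematic of the rule's structure, not the paper's exact formulation.

```python
import numpy as np

def error_triggered_update(w, pre, post_trace, err_event, lr=0.01):
    """Weights change only on an error event; the update itself uses
    only locally available pre- and postsynaptic activity."""
    if err_event == 0.0:
        return w  # no error event -> no write to the memristive weights
    return w - lr * err_event * np.outer(post_trace, pre)

w = np.zeros((2, 3))
pre = np.array([1.0, 0.0, 1.0])     # presynaptic activity
post = np.array([0.5, 1.0])         # postsynaptic traces
w = error_triggered_update(w, pre, post, err_event=1.0)
```

Gating the update on error events drastically reduces the number of device writes, which matters for memristive weights with limited endurance.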


We proposed a solution to the temporal credit assignment problem by exploiting the resistance drift of Phase-Change Memory (PCM) in its High Resistive State to implement Eligibility Traces (ETs). ETs act as long temporal filters that accumulate gradient information on a temporal signal for the duration of their time constant. Implementing such seconds-long temporal filters requires large capacitors on silicon, and thus a resistive-memory solution saves a lot of silicon real estate!

Yiğit Demirağ, Filippo Moro, Thomas Dalgaty, Gabriele Navarro, Charlotte Frenkel, Giacomo Indiveri, Elisa Vianello, and Melika Payvand, "PCM-trace: Scalable Synaptic Eligibility Traces with Resistivity Drift of Phase-Change Materials"
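The drift-as-trace idea rests on the well-known empirical power law for PCM resistance drift in the high-resistive state, R(t) = R0 · (t/t0)^ν, so the cell's conductance decays slowly over seconds without any capacitor. A sketch with illustrative (not measured) constants:

```python
def pcm_conductance(t, r0=1e5, t0=1e-3, nu=0.1):
    """Conductance of a drifting PCM cell: the resistance follows the
    power law R(t) = R0 * (t / t0)**nu, so 1/R decays slowly and can
    serve as a seconds-long eligibility trace."""
    return 1.0 / (r0 * (t / t0) ** nu)

g_early = pcm_conductance(0.01)  # 10 ms after the write
g_late = pcm_conductance(1.0)    # 1 s after the write
assert g_late < g_early          # the trace decays slowly over time
```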