Spiking Reservoir Networks: Brain-inspired recurrent algorithms that use random, fixed synaptic strengths

By Nicholas Soures and Dhireesha Kudithipudi

A class of brain-inspired recurrent algorithms known as reservoir computing (RC) networks reduces the computational complexity and cost of training machine-learning models by using random, fixed synaptic strengths. This article offers insights into a spiking reservoir network, the liquid state machine (LSM): the inner workings of the algorithm, its design metrics, and neuromorphic designs. The discussion extends to variations of the LSM that incorporate local plasticity mechanisms and hierarchy to improve performance and memory capacity.
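To illustrate the core idea of reservoir computing that the abstract describes, the sketch below builds a minimal non-spiking, echo-state-style reservoir in Python. It is not the authors' LSM implementation; all dimensions, scaling constants, and the next-step-prediction task are illustrative assumptions. The key point it demonstrates is that the input and recurrent synaptic weights are random and fixed, so training reduces to fitting a single linear readout by least squares.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not from the article): 1 input, 100 reservoir neurons
n_in, n_res, n_steps = 1, 100, 500

# Random, fixed synaptic strengths -- these are never trained
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W_res = rng.normal(0.0, 1.0, (n_res, n_res))
# Scale the recurrent weights so the spectral radius is below 1,
# a common heuristic for stable reservoir dynamics
W_res *= 0.9 / max(abs(np.linalg.eigvals(W_res)))

# Drive the reservoir with a toy input signal and collect its states
u = np.sin(np.linspace(0, 8 * np.pi, n_steps)).reshape(-1, 1)
x = np.zeros(n_res)
states = np.empty((n_steps, n_res))
for t in range(n_steps):
    x = np.tanh(W_in @ u[t] + W_res @ x)
    states[t] = x

# Only the linear readout is trained (here: least squares on a
# next-sample prediction task) -- this is the cheap part of RC training
target = np.roll(u, -1, axis=0)[:-1]
W_out, *_ = np.linalg.lstsq(states[:-1], target, rcond=None)
pred = states[:-1] @ W_out
```

A spiking LSM replaces the `tanh` units with spiking neuron models (e.g., leaky integrate-and-fire) and reads out from low-pass-filtered spike trains, but the division of labor is the same: a fixed random recurrent "liquid" plus a trained readout.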

Read on IEEE Xplore