Julian Lemmel

Supervisor: Radu Grosu

Biologically Plausible Training Algorithms for Recurrent Neural Networks

 

Recurrent Neural Networks (RNNs) are widely used for Machine Learning on time-series data such as that found in continuous control systems, language processing, or weather and climate prediction. Although initially inspired by biological neural networks, RNNs are commonly considered biologically implausible. This is mainly due to the use of Backpropagation Through Time (BPTT) for training, which relies on propagating exact gradients of a global loss function back to all upstream nodes in the network and consequently requires biologically implausible weight sharing between the forward and backward pathways. In contrast, biological learning is thought to be implemented by local plasticity rules that depend on low-dimensional reward signals transmitted through separate pathways with different weights. Recent work has examined a number of biologically plausible learning rules, such as local approximations of the gradient of the global loss and the use of random feedback weights. To date, however, no competitive alternative to backpropagation has been found. A more biologically plausible learning algorithm may enable faster and more data-efficient training, as well as provide insights into the inner workings of biological learning. This Research Proposal identifies models of synaptic plasticity paired with the concept of eligibility traces as a point of departure for further investigation into the topic.
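To make the random-feedback idea concrete, the following toy NumPy sketch trains a one-hidden-layer network with feedback alignment: the backward pass routes the output error through a fixed random matrix `B` instead of the transposed forward weights `W2.T`, so the forward and backward pathways share no weights. All sizes, learning rates, and the regression task itself are illustrative assumptions, not values from this proposal.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: learn y = A x (A is an arbitrary linear map).
n_in, n_hid, n_out = 4, 16, 2
A = rng.normal(size=(n_out, n_in))

W1 = rng.normal(scale=0.1, size=(n_hid, n_in))  # trained forward weights
W2 = rng.normal(scale=0.1, size=(n_out, n_hid))
B = rng.normal(scale=0.1, size=(n_hid, n_out))  # fixed random feedback, never trained

# Held-out validation set to check that learning actually happens.
Xv = rng.normal(size=(200, n_in))
Yv = Xv @ A.T

def val_mse(W1, W2):
    H = np.tanh(Xv @ W1.T)
    return float(np.mean((H @ W2.T - Yv) ** 2))

before = val_mse(W1, W2)
lr = 0.02
for _ in range(2000):
    x = rng.normal(size=n_in)
    y = A @ x
    h = np.tanh(W1 @ x)
    e = W2 @ h - y                 # output error
    dh = (B @ e) * (1 - h ** 2)    # error routed through B, not W2.T
    W2 -= lr * np.outer(e, h)
    W1 -= lr * np.outer(dh, x)
after = val_mse(W1, W2)

print(before, after)  # validation loss should drop during training
```

Despite the mismatch between forward and feedback weights, the forward weights tend to align with `B` over training, which is what makes this family of rules a candidate for biologically plausible credit assignment.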
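The eligibility-trace idea can be sketched in a few lines: each synapse keeps a decaying memory of recent pre-/postsynaptic coincidences, and a later, low-dimensional reward signal converts that memory into an actual weight change (a three-factor rule). All activity values, the decay factor, and the learning rate below are made-up illustrative numbers.

```python
gamma, lr = 0.9, 0.2      # trace decay and learning rate (illustrative)
trace, w = 0.0, 0.5       # per-synapse eligibility trace and weight

pre    = [1.0, 0.0, 1.0, 1.0]  # presynaptic activity over four steps
post   = [0.5, 0.2, 0.8, 0.1]  # postsynaptic activity
reward = [0.0, 0.0, 0.0, 1.0]  # reward arrives only at the final step

for t in range(4):
    # Local, causal update: no global error signal needed here.
    trace = gamma * trace + pre[t] * post[t]
    # Weight change is gated by the scalar reward (third factor).
    w += lr * reward[t] * trace

print(trace, w)  # final trace ≈ 1.1845, w ≈ 0.7369
```

Note that the coincidence at step 0 still contributes to the update at step 3 through the decayed trace, which is how such rules bridge the temporal gap between activity and delayed reward without backpropagating through time.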