Study Finds that Sleep Prevents Catastrophic Forgetting in Spiking Neural Networks
Image by Ava Bayley for The UCSD Guardian

A new study published by the Maxim Bazhenov Lab at UC San Diego found that replicating sleep in spiking neural networks allows these networks to learn new tasks without forgetting old ones.

Artificial neural networks contain multiple layers of interconnected computer-simulated neurons. According to IBM, the most basic ANNs have three layers: an input layer, a hidden layer, and an output layer. ANNs operate through communication between these neurons: depending on the signals it receives, a neuron may or may not transmit a signal to the next node.
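For readers who want a concrete picture, a minimal Python sketch of such a three-layer network appears below. The layer sizes, weights, and activation function are illustrative choices, not details taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative weights for a tiny three-layer network:
# 4 inputs -> 5 hidden neurons -> 2 outputs (sizes chosen arbitrarily).
W_hidden = rng.normal(size=(4, 5))
W_output = rng.normal(size=(5, 2))

def relu(x):
    # A neuron passes a signal on only when its input is positive.
    return np.maximum(x, 0.0)

def forward(x):
    hidden = relu(x @ W_hidden)   # input layer -> hidden layer
    return hidden @ W_output      # hidden layer -> output layer

print(forward(np.ones(4)))        # prints two output values
```

Each neuron here sums its weighted inputs and either passes a signal forward or stays silent, which is the behavior the IBM description refers to.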

In this study, the authors used spiking neural networks. While the SNNs in this study have the same layered structure as an ANN, they are more similar to biological neurons in that they communicate through discrete electrical events known as spikes. SNNs take the timing of these signals into account, while conventional ANNs do not.
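The role of spike timing can be illustrated with a leaky integrate-and-fire neuron, one of the simplest spiking-neuron models. This sketch is a generic textbook model, not the specific neuron model used in the study; the threshold and leak values are arbitrary.

```python
def lif_spikes(input_current, threshold=1.0, leak=0.9):
    """Leaky integrate-and-fire neuron: the membrane voltage accumulates
    input, decays ("leaks") each time step, and emits a spike (1) when it
    crosses the threshold, after which it resets to zero."""
    v = 0.0
    spikes = []
    for i in input_current:
        v = leak * v + i
        if v >= threshold:
            spikes.append(1)
            v = 0.0           # reset after spiking
        else:
            spikes.append(0)
    return spikes

# A steady input produces spikes at regular intervals, so information
# is carried in *when* the neuron fires, not just in a signal's size.
print(lif_spikes([0.4] * 10))  # -> [0, 0, 1, 0, 0, 1, 0, 0, 1, 0]
```

A conventional ANN neuron would map the same steady input to a single steady output value; the spiking neuron instead produces a temporal pattern.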

One of the major problems researchers face when using SNNs is that the networks can only retain the ability to solve one type of problem at a time. For example, the SNN in this study was trained to recognize patterns. If an SNN is taught to solve two types of problems in sequence, it will only retain the ability to solve the second type; the first task is overwritten in favor of the second. This is called “catastrophic forgetting.”

This problem can be solved by teaching both tasks at once, using the training data sets for the two tasks concurrently. While this method expands what SNNs are capable of, it does not replicate the continuous learning of the human brain. Humans are able to learn many tasks in sequence without forgetting how to do previous activities.

Fourth-year Ph.D. student and co-author Erik Delanois provided this analogy: “If you spend some time learning how to play golf and you take a break and learn how to play tennis, you’re not going to forget how to play golf just because you learned how to play tennis.”

The researchers found that after a first task is learned, a second task can be learned if the learning period is mixed with periods of “sleep.” More specifically, the researchers developed and applied a model for REM, or rapid eye movement sleep. This type of sleep has been implicated in the consolidation of procedural memories. 

After the network learned one task, sleep was interwoven with the learning of a second task. Theoretically, this allows the SNN to replay the old task while concurrently learning the new one. At the end of the training period, the SNN was able to complete both tasks without any catastrophic forgetting, providing new insight into how REM sleep contributes to memory consolidation.
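The logic of interleaving learning with sleep can be caricatured in a few lines of Python. This toy is not the lab’s model: the “network” is a single weight vector, each task is a target vector, and “sleep” is reduced to a small nudge back toward the first task. In the actual study, sleep is an unsupervised phase of spontaneous spike replay, not supervised training on old data.

```python
import numpy as np

def train_step(w, target, lr=0.1):
    # Move the weights a small step toward the current task's target.
    return w + lr * (target - w)

task_a = np.array([1.0, 0.0])   # first task (hypothetical target)
task_b = np.array([0.0, 1.0])   # second task (hypothetical target)

# Phase 1: learn task A.
w = np.zeros(2)
for _ in range(50):
    w = train_step(w, task_a)

# Phase 2a: learn task B with no sleep -> task A gets overwritten.
w_awake = w.copy()
for _ in range(50):
    w_awake = train_step(w_awake, task_b)

# Phase 2b: interleave task-B training with "sleep" replay of task A.
w_sleep = w.copy()
for _ in range(50):
    w_sleep = train_step(w_sleep, task_b)
    w_sleep = train_step(w_sleep, task_a, lr=0.05)  # replay during sleep

def error(w, task):
    return float(np.linalg.norm(w - task))

print("no sleep, task A error:  ", round(error(w_awake, task_a), 3))
print("with sleep, task A error:", round(error(w_sleep, task_a), 3))
```

In this caricature, the network trained without sleep ends up far from the first task’s solution, while the interleaved version retains much more of it, which is the qualitative effect the study reports for its SNN.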

“It helps to visualize and understand what synapses could potentially be doing in biology. It’s a good example and illustration of what’s going on under the hood,” Delanois said.

The authors of the article include Delanois, Ryan Golden, Pavel Sanda from the Institute of Computer Science of the Czech Academy of Sciences, and UCSD Professor Maxim Bazhenov.


About the Contributors
Chelsea Blankenchip, Contributing Writer
Chelsea is a doctoral student in UCSD's Biomedical Sciences Program who has a passion for science communication. Her research focuses on bacteria, but she likes to write about all types of science.
Ava Bayley
Ava Bayley, Art Editor
As the Art Editor, Ava spends the majority of her time double majoring in Human Biology and Sociology, working as an EMT, and pursuing medicine as a career. Anything to avoid actually doing art.