Study Finds that Sleep Prevents Catastrophic Forgetting in Spiking Neural Networks

Photo by Ava Bayley/ UCSD Guardian

A new study from the Maxim Bazhenov Lab at UC San Diego found that replicating sleep in spiking neural networks allows these networks to learn new tasks without forgetting old ones.

Artificial neural networks (ANNs) contain multiple layers of interconnected computer-simulated neurons. According to IBM, the most basic ANNs have three layers: an input layer, a hidden layer, and an output layer. ANNs operate through communication between these neurons: depending on the signals it receives, a neuron may or may not pass a signal on to the next node.
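
To make the idea concrete, here is a minimal sketch of such a three-layer network in Python. The layer sizes and the sigmoid activation are arbitrary choices for illustration, not the network used in the study.

```python
import numpy as np

# Minimal sketch of a 3-layer artificial neural network (input -> hidden -> output).
# Layer sizes and the sigmoid activation are arbitrary choices for this example.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))   # weights from 3 input neurons to 4 hidden neurons
W2 = rng.normal(size=(2, 4))   # weights from 4 hidden neurons to 2 output neurons

def forward(inputs):
    hidden = sigmoid(W1 @ inputs)   # each hidden neuron sums its weighted inputs
    output = sigmoid(W2 @ hidden)   # output neurons do the same with hidden activity
    return output

print(forward(np.array([0.5, 0.1, 0.9])))
```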

In this study, the authors used spiking neural networks (SNNs). While SNNs have the same layered structure as ANNs, they behave more like biological neurons: they communicate through discrete electrical events known as spikes. SNNs take the timing of these signals into account, while ANNs do not.
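
A common building block of SNNs is the leaky integrate-and-fire neuron, sketched below. The parameters and the input drive are illustrative only and are not taken from the paper, whose model is more detailed.

```python
import numpy as np

# Toy leaky integrate-and-fire neuron, a common building block of spiking networks.
# Parameters and input are illustrative placeholders, not values from the study.

dt, tau, v_rest, v_thresh, v_reset = 1.0, 20.0, 0.0, 1.0, 0.0
v = v_rest
spike_times = []

rng = np.random.default_rng(1)
input_current = rng.uniform(0.0, 0.15, size=200)   # arbitrary input drive per time step

for t, i_in in enumerate(input_current):
    v += dt / tau * (v_rest - v) + i_in   # potential leaks toward rest and integrates input
    if v >= v_thresh:                     # crossing threshold produces a discrete spike
        spike_times.append(t)
        v = v_reset                       # potential resets after the spike

print("spike times:", spike_times)
```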

One of the major problems with SNNs is that they retain the ability to solve only one task at a time. The SNN in this study, for example, was trained to recognize patterns. If an SNN is taught two types of problems in sequence, it retains only the ability to solve the second type; the first task is overwritten in favor of the second. This is called “catastrophic forgetting.”

This problem can be solved by teaching both tasks at once, using the training data sets for the two tasks concurrently. While this method expands what SNNs are capable of, it does not replicate the continuous learning of the human brain: humans are able to learn many tasks in sequence without forgetting how to do previous activities.
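
The difference between the two training schedules can be sketched as follows; the datasets here are hypothetical placeholders, not the pattern-recognition tasks used in the study.

```python
import numpy as np

# Conceptual sketch of the two training schedules described above.
# task_a_data and task_b_data are hypothetical placeholders.

task_a_data = [("task A example", "task A label")] * 100
task_b_data = [("task B example", "task B label")] * 100

def sequential_schedule():
    # Train fully on task A, then fully on task B: the later task
    # overwrites the earlier one (catastrophic forgetting).
    return task_a_data + task_b_data

def interleaved_schedule():
    # Mix both datasets so every stretch of training contains examples of both
    # tasks; this avoids forgetting but requires keeping the old data around.
    mixed = task_a_data + task_b_data
    np.random.default_rng(0).shuffle(mixed)
    return mixed

print(len(sequential_schedule()), len(interleaved_schedule()))
```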

Fourth-year Ph.D. student and co-author Erik Delanois provided this analogy: “If you spend some time learning how to play golf and you take a break and learn how to play tennis, you’re not going to forget how to play golf just because you learned how to play tennis.”

The researchers found that after a first task is learned, a second task can be learned if its training period is mixed with periods of “sleep.” More specifically, they developed and applied a model of REM (rapid eye movement) sleep. This type of sleep has been implicated in the consolidation of procedural memories.

After the network learned one task, sleep was interwoven with the learning of a second task. In theory, this allows the SNN to replay the old task while concurrently learning the new one. At the end of the training period, the SNN was able to complete both tasks without any catastrophic forgetting, providing new insight into how REM sleep contributes to memory consolidation.
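
As a rough illustration of this schedule, the sketch below alternates toy “awake” training steps with “sleep” phases driven only by the network’s own activity. The weight-update rules and parameters are placeholders rather than the study’s actual model.

```python
import numpy as np

# Rough sketch of the schedule described above: new-task training alternates with
# "sleep" phases in which the network runs on its own spontaneous activity and a
# local, Hebbian-style rule strengthens whatever patterns get replayed.
# All update rules and parameters here are illustrative placeholders.

rng = np.random.default_rng(0)
weights = rng.normal(scale=0.1, size=(10, 10))   # toy recurrent weight matrix

def train_step(weights, example, lr=0.01):
    # Placeholder "awake" update: nudge weights toward the training example.
    return weights + lr * np.outer(example, example)

def sleep_phase(weights, steps=50, lr=0.001):
    # Placeholder "sleep": activity is generated from noise, with no external
    # input or labels, and a Hebbian update reinforces the replayed patterns.
    for _ in range(steps):
        activity = np.tanh(weights @ rng.normal(size=10))
        weights = weights + lr * np.outer(activity, activity)
    return weights

new_task_examples = [rng.normal(size=10) for _ in range(20)]
for example in new_task_examples:
    weights = train_step(weights, example)   # learn the new task...
    weights = sleep_phase(weights)           # ...interleaved with sleep-like replay
```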

“It helps to visualize and understand what synapses could potentially be doing in biology. It’s a good example and illustration of what’s going on under the hood,” Delanois said.

The authors of the article include Delanois and Ryan Golden, Pavel Sanda from the Institute of Computer Science of the Czech Academy of Sciences, and UCSD Professor Maxim Bazhenov.

Art by Ava Bayley for the UCSD Guardian.

About the Contributors
Chelsea Blankenchip, Contributing Writer
Chelsea is a doctoral student in UCSD's Biomedical Sciences Program who has a passion for science communication. Her research focuses on bacteria, but she likes to write about all types of science.
Ava Bayley, Art Editor
As the Art Editor, Ava spends the majority of her time double majoring in Human Biology and Sociology, working as an EMT, and pursuing medicine as a career. Anything to avoid actually doing art.