Unsupervised Learning to Overcome Catastrophic Forgetting in Neural Networks

Munoz-Martin I.; Bianchi S.; Pedretti G.; Melnic O.; Ambrogio S.; Ielmini D.
2019-01-01

Abstract

Continual learning is the ability to acquire a new task or new knowledge without losing previously stored information. Achieving continual learning in artificial intelligence (AI) is currently prevented by catastrophic forgetting, where training on a new task overwrites previously learned tasks. Here, we present a new concept of a neural network that combines supervised convolutional learning with bio-inspired unsupervised learning. Brain-inspired concepts such as spike-timing-dependent plasticity (STDP) and neural redundancy are shown to enable continual learning and prevent catastrophic forgetting without compromising the accuracy achievable with state-of-the-art neural networks. Unsupervised learning by STDP is demonstrated by hardware experiments with a one-layer perceptron adopting phase-change memory (PCM) synapses. Finally, we demonstrate full classification of the Modified National Institute of Standards and Technology (MNIST) test set with 98% accuracy, and continual learning of up to 30% non-trained classes with 83% average accuracy.
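The pair-based STDP rule mentioned in the abstract can be sketched as follows. This is only an illustrative model of spike-timing-dependent plasticity, not the paper's implementation: the learning rates (`a_plus`, `a_minus`), time constant (`tau`), and the [0, 1] conductance bounds are assumed parameters, not values from the PCM hardware experiments.

```python
import numpy as np

def stdp_update(weights, pre_spike_times, post_spike_time,
                a_plus=0.05, a_minus=0.05, tau=20.0):
    """Illustrative pair-based STDP rule: synapses whose presynaptic
    spike precedes the postsynaptic spike are potentiated, the others
    are depressed, with exponential dependence on the timing difference.
    All parameters are illustrative assumptions."""
    dt = post_spike_time - pre_spike_times      # > 0: pre fires before post
    dw = np.where(dt > 0,
                  a_plus * np.exp(-dt / tau),   # long-term potentiation
                  -a_minus * np.exp(dt / tau))  # long-term depression
    # keep synaptic conductances in a bounded range, as a physical
    # memory device (e.g., PCM) would impose
    return np.clip(weights + dw, 0.0, 1.0)

# toy example: three synapses, postsynaptic neuron fires at t = 25 ms
w = np.array([0.5, 0.5, 0.5])
pre = np.array([20.0, 25.0, 40.0])              # presynaptic spike times (ms)
w_new = stdp_update(w, pre, 25.0)
# the synapse that fired 5 ms before the output spike is strengthened;
# the one that fired 15 ms after it is weakened
```

In an unsupervised perceptron of the kind described in the abstract, repeated application of such a rule lets frequently co-active input patterns capture synaptic weight without any labels.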
Catastrophic forgetting; continual learning; convolutional neural network (CNN); neuromorphic engineering; phase-change memory (PCM); spike-timing-dependent plasticity (STDP); supervised learning; unsupervised learning
Files in this product:
jxcdc19.pdf — Publisher's version, open access, Adobe PDF, 4.95 MB

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1098188
Citations
  • PMC: not available
  • Scopus: 27
  • Web of Science: 24