
Low-current, highly linear synaptic memory device based on MoS2 transistors for online training and inference

Farronato, Matteo;Ricci, Saverio;Compagnoni, Christian Monzio;Ielmini, Daniele
2022-01-01

Abstract

In-memory computing (IMC) is attracting strong interest for hardware accelerators of neural networks in artificial intelligence (AI) applications. To that aim, high-density memory arrays are used as artificial synaptic arrays, storing the weights of the neural network and performing the matrix-vector multiplication (MVM) involved in network operation. Within these implementations, in-situ update of the weights can be achieved during network training, thus avoiding power-hungry data movement. For training applications, a key requirement for synaptic devices is the capability to operate at low current, to avoid a large peripheral-circuit area and excessive current-resistance (IR) drop. High linearity of the weight update is also necessary to accelerate the outer product for online training by backpropagation. To meet all these demands, in this work we present a novel synaptic memory device based on interface-state trapping in MOS transistors with a 2D MoS2 channel. By operating the device in the deep-subthreshold regime, a very low (few nS) conductance, linearly updatable with pulses of equal amplitude, is demonstrated. Simulations of neural-network training show an accuracy of 96.8% on MNIST, close to the floating-point accuracy of 97.8%.
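The MVM mapping the abstract refers to can be sketched as follows: weights are stored as device conductances, input activations are applied as row voltages, and the column currents give the product by Ohm's and Kirchhoff's laws. This is a minimal illustrative sketch, not the authors' circuit; the array size, voltages, and conductance range are assumptions (the few-nS range mirrors the abstract).

```python
import numpy as np

# Hypothetical IMC array: each weight is a device conductance G[i, j]
# (siemens). The few-nS range is taken from the abstract; sizes and
# voltages are illustrative only.
rng = np.random.default_rng(0)
G = rng.uniform(1e-9, 5e-9, size=(4, 3))  # 4 input rows, 3 output columns

# Input activations encoded as row voltages (volts).
V = np.array([0.1, 0.0, 0.2, 0.1])

# Column currents: I_j = sum_i G[i, j] * V[i] -- the matrix-vector
# multiplication happens "in place" in the array, no weight movement.
I = G.T @ V  # one output current per column (amperes)
print(I.shape)
```

The same picture explains why low conductance and linear update matter: the column current sets the sense-amplifier requirements and IR drop, and a linear conductance-vs-pulse characteristic lets equal-amplitude pulses implement the backpropagation outer-product update directly.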
2022
2022 IEEE 4th International Conference on Artificial Intelligence Circuits and Systems (AICAS)
978-1-6654-0996-4
Files in this product:
2022_aicas.pdf — Publisher's version, Adobe PDF, 1.56 MB (restricted access)

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1221130
Citations
  • Scopus: 3
  • Web of Science: 1