Low-current, highly linear synaptic memory device based on MoS2 transistors for online training and inference
Farronato, Matteo; Ricci, Saverio; Compagnoni, Christian Monzio; Ielmini, Daniele
2022-01-01
Abstract
In-memory computing (IMC) is attracting strong interest for hardware acceleration of neural networks in artificial intelligence (AI) applications. To that aim, high-density memory arrays are used as artificial synaptic arrays, storing the weights of the neural network and performing the matrix-vector multiplication (MVM) involved in network operation. Within these implementations, in-situ weight update can be achieved during network training, thus avoiding power-hungry data movement. For training applications, a key requirement for synaptic devices is the capability to operate at low current, to avoid a large peripheral-circuit area and excessive current-resistance (IR) drop. Also, high linearity of the weight update is necessary to accelerate the outer product used in online training by backpropagation. To meet these demands, in this work we present a novel synaptic memory device based on interface-state trapping in MOS transistors with a 2D MoS2 channel. By operating the device in the deep-subthreshold regime, a very low (few nS) conductance that is linearly updatable with pulses of equal amplitude is demonstrated. Simulations of neural-network training show an accuracy of 96.8% on MNIST, close to the floating-point accuracy of 97.8%.
File | Size | Format
---|---|---
2022_aicas.pdf (publisher's version, restricted access) | 1.56 MB | Adobe PDF
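The abstract describes the two operations a synaptic array must support: an in-memory MVM, where the stored conductances multiply the input voltages and the resulting currents sum on each column, and a linear weight update, where each identical pulse adds a fixed conductance step. The NumPy sketch below illustrates both; the conductance window, pulse count, and array size are illustrative assumptions, not device data from the paper.

```python
import numpy as np

# Assumed device parameters (illustrative, not measured values from the paper):
# a few-nS conductance window, updated linearly over a fixed number of pulses.
G_MIN, G_MAX = 1e-9, 5e-9   # conductance window (siemens)
N_PULSES = 64               # assumed number of identical pulses spanning the window

def mvm(G, v):
    """In-memory MVM: Ohm's law gives per-device currents G[i, j] * v[i],
    and Kirchhoff's current law sums them along each column."""
    return G.T @ v

def potentiate(G, n=1):
    """Linear update: each equal-amplitude pulse adds the same conductance step,
    clipped to the device's conductance window."""
    step = (G_MAX - G_MIN) / N_PULSES
    return np.clip(G + n * step, G_MIN, G_MAX)

# 4x3 synaptic array initialized at minimum conductance.
G = np.full((4, 3), G_MIN)
v = np.array([0.1, 0.2, 0.0, 0.1])  # input voltages (V)

G = potentiate(G, n=8)  # eight equal-amplitude pulses
I = mvm(G, v)           # output currents (A), one per column
```

Because the update is linear, applying four pulses twice reaches the same conductance as eight pulses at once; this additivity is what makes the outer-product update for backpropagation efficient in hardware.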
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.