Bio-Inspired Computing with Emerging Devices: Bridging 2D Materials and Neuromorphic Architectures
M. Farronato;P. Mannocci;A. Milozzi;C. Monzio Compagnoni;D. Ielmini
2025-01-01
Abstract
In recent years, the rapid growth of data, internet connectivity, and digital technologies has driven an urgent need for processing systems that combine high performance with energy efficiency. Artificial intelligence (AI) has emerged as a powerful tool, leveraging deep learning models such as convolutional neural networks (CNNs) and large language models (LLMs) to process vast amounts of data while adapting to dynamic environments. However, these models require significant hardware resources, especially for training, leading to unsustainable energy consumption [1]. Bio-inspired computing (BIC) presents a promising solution to these challenges by emulating biological brain functions to enable more efficient computation. Indeed, the human brain is capable of an extraordinary range of operations, such as recognizing images, understanding and generating language, coordinating complex body movements, making decisions, and learning from past experience. What makes these abilities remarkable is the brain's energy efficiency, which far exceeds that of modern computing architectures and is achieved through an exceptionally high degree of parallelism [2]. This efficiency stems from the colocalization of memory and processing within the brain, in contrast to traditional computing architectures, which keep the two physically separate [3]. In biological systems, the fundamental building blocks are neurons and synapses: neurons are specialized cells that can sense, generate, propagate, and respond to electrical signals [4], while synapses serve as dynamic communication junctions between neurons. Together, they form vast, intricate networks that underpin all cognitive functions. Thanks to these characteristics, BIC has been successfully applied across various domains (Figure 1a). This growing interest has driven extensive research into innovative devices designed to replicate neural processing with enhanced efficiency.
Devices such as resistive random-access memories (RRAMs), phase-change memories (PCMs), three-dimensional crosspoint (3DXP) memories, and electrochemical random-access memories (ECRAMs) are increasingly recognized as promising candidates for BIC applications. More recently, devices based on two-dimensional (2D) materials have garnered significant attention due to their exceptional potential for device scaling and integration into advanced electronic architectures. Indeed, 2D materials offer intrinsic atomic-scale thickness, remarkable electronic properties, and cost-effective integration schemes, which make them ideal candidates as active materials for artificial neurons and synapses. In parallel with the advancement of device technology, neuromorphic engineering has given shape to BIC architectures. Inspired by biological sensory processing, a variety of approaches have emerged, including artificial neural networks (ANNs), spiking neural networks (SNNs), and biomimetic systems such as silicon retinae, silicon cochleae, and other in-sensor computing platforms. Recently, the paradigm of reservoir computing has emerged as a promising approach for low-power, real-time signal processing. The brain-inspired nature of this approach makes it well suited to biomedical applications such as seizure detection (Figure 1b) [5]. Despite significant progress, the development of BIC continues to face several critical challenges. These span technological limitations, architectural design complexities, and the translation of concepts into practical, real-world applications. Each of these areas remains a focus of active research, with ongoing efforts to overcome these barriers and unlock the full potential of BIC systems.
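To make the reservoir computing paradigm mentioned above concrete, the following is a minimal software sketch of an echo state network, the standard reservoir model: a fixed random recurrent network projects an input signal into a high-dimensional state, and only a linear readout is trained. All sizes, scaling factors, and the toy delayed-recall task are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random reservoir: input weights W_in and recurrent weights W are
# never trained; W is rescaled so its spectral radius is below 1, a common
# heuristic for stable fading memory in echo state networks.
n_in, n_res = 1, 100
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the reservoir with a 1-D input sequence and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.array([u_t]) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task (hypothetical): reproduce the input delayed by 2 steps,
# a simple probe of the reservoir's short-term memory.
u = rng.uniform(-1, 1, 500)
X = run_reservoir(u)
y = np.roll(u, 2)  # target: y[t] = u[t-2]

# Train only the linear readout by least squares on a washout-trimmed window.
X_tr, y_tr = X[10:400], y[10:400]
W_out = np.linalg.lstsq(X_tr, y_tr, rcond=None)[0]

pred = X[400:] @ W_out
err = np.mean((pred - y[400:]) ** 2)  # low MSE: the delayed input was recovered
```

The key design point, and the reason the approach suits low-power hardware, is that only `W_out` is learned; the nonlinear recurrent dynamics can be provided "for free" by a physical system such as a memristive device array, with training reduced to a single linear regression.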
File: 2025_isvlsi.pdf — Publisher's version, 319.03 kB, Adobe PDF (restricted access)
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.


