
Federated On-Device Learning of Integer-Based Convolutional Neural Networks

Colombo, Luca
2024-01-01

Abstract

On-device learning is one of the most relevant and challenging tasks in the field of tiny machine learning. Indeed, most state-of-the-art solutions focus on on-device inference, with only a few works dealing with the incremental fine-tuning of pre-trained machine and deep learning models. An interesting approach to enabling multiple devices to collaboratively train a shared model is Federated Learning (FL). Nevertheless, state-of-the-art floating-point FL solutions are tailored for edge devices (e.g., smartphones and tablets) and are therefore not suitable for devices with limited memory, computation, and energy resources. To date, only one work in the literature has proposed a tiny integer-based federated learning algorithm for training Multi-Layer Perceptrons (MLPs). In contrast to the existing literature, we introduce, for the first time, an integer-based FL algorithm capable of training integer Convolutional Neural Networks (CNNs) specifically designed to operate on resource-constrained devices. More specifically, the proposed algorithm is based on integer local error signals, enabling each device to train either the whole CNN or only a portion of it on its own data, according to its resource availability. Experimental results on multi-class image classification benchmarks demonstrate the effectiveness of the proposed solution, showing an accuracy improvement over the state-of-the-art FL algorithm for integer MLP architectures and the capability of training integer CNN architectures with minimal accuracy degradation compared to traditional floating-point FL solutions. The proposed algorithm is made available to the scientific community as a public repository.
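To make the aggregation side of such a scheme concrete, the following is a minimal sketch of integer-only federated averaging: each client contributes quantized integer weights, and the server computes a sample-weighted average using only integer arithmetic (rounded division), so no floating-point unit is required. This is a generic illustration under stated assumptions, not the paper's actual algorithm; the function name `int_fedavg` and the flat-list weight representation are hypothetical.

```python
def int_fedavg(client_weights, sample_counts):
    """Aggregate integer weight vectors from several clients with
    integer-only arithmetic (illustrative sketch, not the paper's API).

    client_weights: list of flat lists of int weights, one per client.
    sample_counts:  list of ints, local dataset size of each client.
    Returns the sample-weighted average, rounded to nearest integer.
    """
    total = sum(sample_counts)
    num_params = len(client_weights[0])
    aggregated = []
    for i in range(num_params):
        # Weighted accumulation stays in integer arithmetic.
        acc = sum(w[i] * n for w, n in zip(client_weights, sample_counts))
        # Rounded integer division: add total // 2 before dividing.
        aggregated.append((acc + total // 2) // total)
    return aggregated

# Two hypothetical clients with int8-range weights and 3 vs. 1 local samples.
print(int_fedavg([[10, -4, 6], [6, 0, 2]], [3, 1]))  # → [9, -3, 5]
```

Accumulating before dividing keeps precision, and the single rounded division at the end mirrors how integer-only training pipelines typically defer rescaling until after aggregation.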
2024
AIMLSystems '24: Proceedings of the 4th International Conference on AI-ML Systems
Federated learning
Tiny machine learning
Local error signals
Deep learning
Integer-only training
Files in this record:

File: 3703412.3703423.pdf
Description: Paper (Publisher's version)
Access: open access
Size: 886.11 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1286131
Citations
  • Scopus: 0