Federated Learning with Mutually Cooperating Devices: A Consensus Approach Towards Server-Less Model Optimization

Savazzi, S.; Nicoli, M.; Rampa, V.; Kianoush, S.
2020-01-01

Abstract

Federated learning (FL) is emerging as a new paradigm for training a machine learning model in cooperative networks. The model parameters are optimized collectively by large populations of interconnected devices, acting as cooperative learners that exchange local model updates with the server rather than user data. The FL framework is, however, centralized: it relies on the server to fuse the model updates and is therefore limited by a single point of failure. In this paper we propose a distributed FL approach that performs a decentralized fusion of local model parameters by leveraging mutual cooperation between the devices and local (in-network) data operations via consensus-based methods. Communication with the server can be partially, or fully, replaced by in-network operations, scaling down the traffic load on the server and paving the way towards a fully serverless FL approach. This proposal also lays the groundwork for integrating FL methods within future (beyond 5G) wireless networks characterized by distributed and decentralized connectivity. The proposed algorithms are implemented and published as open source, and they are designed and validated on experimental data.
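To make the consensus-based fusion idea concrete, the following is a minimal sketch in Python. It is not the authors' exact algorithm: it assumes a simple least-squares task, a fixed mixing weight eps, and a symmetric 0/1 adjacency matrix, none of which are specified in the abstract. Each device alternates a gradient step on its own data with a consensus averaging step over its neighbors' parameters, so no central server is needed for aggregation.

import numpy as np

def local_gradient(w, X, y):
    # Gradient of the least-squares loss (1/2n)||Xw - y||^2 on local data (illustrative).
    return X.T @ (X @ w - y) / len(y)

def consensus_fl(devices, adjacency, rounds=50, lr=0.1, eps=0.2):
    # devices: list of (X, y) local datasets, one per device (hypothetical setup).
    # adjacency: symmetric 0/1 numpy array describing which devices can communicate.
    K = len(devices)
    d = devices[0][0].shape[1]
    models = [np.zeros(d) for _ in range(K)]   # one model copy per device
    for _ in range(rounds):
        # Step 1: each device takes a gradient step on its own (private) data.
        models = [w - lr * local_gradient(w, X, y)
                  for w, (X, y) in zip(models, devices)]
        # Step 2: consensus mixing with neighbors replaces server-side fusion.
        mixed = []
        for k in range(K):
            nbrs = np.flatnonzero(adjacency[k])
            mixed.append(models[k] + eps * sum(models[j] - models[k] for j in nbrs))
        models = mixed
    return models

With eps chosen small enough (below the inverse of the largest node degree), the mixing step is a standard average-consensus update; in the paper such in-network exchanges partially or fully replace server aggregation.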
2020
ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
978-1-5090-6631-5
Keywords: consensus; cooperative networks; distributed intelligence; federated learning; machine learning
File: CV_2020_ICASSP_FL.pdf (Publisher's version, Adobe PDF, 635.76 kB, restricted access)
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1145016
Citations: Scopus 9; Web of Science 6