Accelerating Deep Learning Inference on Mobile Systems

Darian Frajberg; Carlo Bernaschina; Piero Fraternali
2019-01-01

Abstract

Artificial Intelligence on the edge is of great importance for enhancing smart devices that rely on operations with real-time constraints. We present PolimiDL, a framework for accelerating Deep Learning on mobile and embedded systems with limited resources and heterogeneous architectures. Experiments show performance competitive with TensorFlow Lite for the execution of small models.
2019
Artificial Intelligence and Mobile Services – AIMS 2019
Deep Learning; Mobile sensing; Acceleration; Mobile devices; Embedded systems; Continuous vision
Files in this item:
File: Accelerating_Deep_Learning_inference_on_mobile_systems.pdf
Access: open access
Type: Post-Print (DRAFT or Author's Accepted Manuscript - AAM)
Size: 282.01 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1090919
Citations
  • Scopus: 2
  • Web of Science: 0