
Automatic stimuli classification from ERP data for augmented communication via Brain-Computer Interfaces

Leoni J.; Tanelli M.; Strada S. C.
2020

Abstract

Brain-computer interfaces (BCIs) are systems initially designed to compensate for motor disabilities in people whose control of the muscular system is compromised. However, recent developments are opening the BCI market to a wide range of medical and non-medical applications. This raises the need for systems capable of interpreting an ever-growing variety of stimuli, even from different sensory domains. In this work, we design a machine-learning system that fits both application domains by accurately recognizing visual and auditory stimuli from the event-related potentials (ERPs) they generate. The obtained results are promising, and some practical and implementation aspects are discussed.
Proceedings of the 2020 IEEE International Conference on Human-Machine Systems, ICHMS 2020
978-1-7281-5871-6
Boosted Trees
Event-related Potentials
Machine-Learning
Neural Networks
Time-Series Classification
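As an illustration of the kind of pipeline the abstract and keywords describe (ERP time-series classification with boosted trees), the sketch below trains a gradient-boosted-tree classifier to separate simulated visual and auditory ERP epochs. Everything here is an assumption for demonstration purposes: the data is synthetic, and the channel count, epoch length, time windows, and feature choices are illustrative, not the paper's actual protocol.

```python
# Hypothetical sketch: classifying simulated ERP epochs (auditory vs. visual)
# with gradient-boosted trees. All data is synthetic; recording geometry and
# feature windows are illustrative assumptions, not the paper's method.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_epochs, n_channels, n_samples = 200, 8, 128  # assumed recording geometry

# Simulate epochs: the "auditory" class gets an earlier deflection, the
# "visual" class a later one (loosely N100-like vs. P300-like, illustrative).
t = np.arange(n_samples)
X_raw = rng.normal(0.0, 1.0, size=(n_epochs, n_channels, n_samples))
y = rng.integers(0, 2, size=n_epochs)  # 0 = auditory, 1 = visual
for i in range(n_epochs):
    peak = 30 if y[i] == 0 else 80
    X_raw[i] += 2.0 * np.exp(-((t - peak) ** 2) / (2 * 10 ** 2))

# Simple feature extraction: mean amplitude per channel in two time windows,
# one covering the early deflection and one covering the late deflection.
early = X_raw[:, :, 20:45].mean(axis=2)
late = X_raw[:, :, 70:95].mean(axis=2)
X = np.hstack([early, late])  # shape: (n_epochs, 2 * n_channels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(f"held-out accuracy: {acc:.2f}")
```

With this synthetic signal-to-noise ratio the two classes are well separated in the windowed-mean features, so the classifier performs far above chance; real ERP data would require proper preprocessing (filtering, artifact rejection, baseline correction) before any such step.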
Files in this product:
File: ICHMS-ERP.pdf
Access: Restricted
Description: Final accepted version of the article
Type: Post-Print (DRAFT or Author's Accepted Manuscript - AAM)
Size: 1.25 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1169206
Citations
  • Scopus: 0