Multimodal Interaction for users with Autism in a 3D Educational Environment

Alessandro Trivilini; Licia Sbattella; Roberto Tedesco
2011-01-01

Abstract

The paper presents a new multimodal 3D educational environment for children with autism. The multimodal interaction system combines visual, voice, and textual modalities; in particular, it allows children with autism to access content through simple iconic symbols designed to guide them through the environment. To that end, it was essential to identify the classes and attributes needed to correctly describe different users. In the architecture, three user profiles were defined and structured following the ICF* model (an extension of the WHO International Classification of Functioning, Disability and Health guidelines), describing both static and dynamic properties. A specific iconic language was used to enrich and present the virtual environment. Simultaneous visual, auditory, and cognitive stimuli were employed with care: for persons with autism they can be barriers, but also rich opportunities. It was not simply a matter of placing information in a virtual space; new languages, metaphors, and codes of interaction had to be designed and developed in order to reduce the distance between the user and the system. Communication through images, sounds, and gestures was therefore fundamental. The project's approach takes into account the user model, the user profiles, personalization, and experimentation.
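The abstract mentions user profiles structured after the ICF model, with both static and dynamic properties. As a minimal illustrative sketch (the class and attribute names below are hypothetical, not taken from the paper), such a profile might separate fixed descriptors from ICF-style attributes that are updated as the child interacts with the environment:

```python
from dataclasses import dataclass, field

@dataclass
class ICFAttribute:
    """An ICF-style descriptor: a category code plus a qualifier level.
    (Names and structure are illustrative, not from the paper.)"""
    code: str   # e.g. an ICF category code such as "d310"
    value: int  # qualifier level for that category

@dataclass
class UserProfile:
    """A user profile with static properties (fixed descriptors)
    and dynamic properties (updated during interaction)."""
    name: str
    static_attributes: dict[str, str] = field(default_factory=dict)
    dynamic_attributes: list[ICFAttribute] = field(default_factory=list)

    def update(self, attr: ICFAttribute) -> None:
        """Record a dynamic observation, replacing any earlier
        value stored for the same ICF code."""
        self.dynamic_attributes = [
            a for a in self.dynamic_attributes if a.code != attr.code
        ]
        self.dynamic_attributes.append(attr)

# One of the profiles could then be instantiated and refined over time:
child = UserProfile("child", static_attributes={"age_group": "6-10"})
child.update(ICFAttribute(code="d310", value=2))
```

Keeping static and dynamic properties in separate fields mirrors the distinction the abstract draws; how the actual system represents ICF categories is not specified there.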
IADIS International Conference Interfaces and Human Computer Interaction 2011, Part of the IADIS Multi Conference on Computer Science and Information Systems 2011, MCCSIS 2011
ISBN: 9789728939526
Files in this item:
Multimodal interaction for users with autism.pdf — Pre-Print (Pre-Refereeing), 1.37 MB, Adobe PDF, restricted access

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11311/636847
Citations
  • Scopus: 1