
Designing Unconscious and Enactive Interaction for Interactive Movie Experience

Laura Varisco; Giulio Interlandi
2020-01-01

Abstract

In a world full of sensors that gather personal data and of digital solutions that use these data to provide feedback and personalized experiences, biofeedback is increasingly involved in the definition of new paradigms for tailoring interaction. Companies collect and use personal data to offer personalized services, and content providers push users to produce data in order to create personalized storytelling experiences. In this context, the tech market offers new low-cost solutions for gathering biodata. This paper reports the results of evidence-based explorations aimed at formalizing knowledge about the use of passive and unconscious interaction to control the fruition of storytelling artifacts. We investigate a new interaction paradigm that promises to seamlessly enable unconscious and enactive interactions for movie experiences. We propose emotion recognition and eye-tracking as exploratory technologies that can contribute to richer access to spectators’ emotional involvement, and we reflect on the disruptive potential of these non-invasive technologies, which can also be used for home-cinema experiences. By investigating users’ emotional states during their decisions, we leverage emotive-cognitive data for the creation and enabling of tailored movie experiences. Our research explores the possibility of extracting knowledge from the recognition of facial expressions, fostering its use in real-time passive interaction: emotion recognition becomes a trigger of enactivity that is not limited to interactive storytelling but opens new scenarios in the design of proactive systems for screens, spaces, and environments. Furthermore, we provide suggestions as guidelines for the design of enactive experiences that leverage emotion recognition and eye-tracking. © 2020, Springer Nature Switzerland AG.
2020
8th International Conference on Distributed, Ambient and Pervasive Interactions, DAPI 2020, held as part of the 22nd International Conference on Human-Computer Interaction, HCII 2020
978-303050343-7
Real-time interaction - Enactive interaction - Eye-controlled interfaces - Emotion recognition - Evidence-based design research
Files in this record:

Varisco_interlandi_camera ready.pdf
Access: restricted
Description: Main article
Type: Post-Print (DRAFT or Author’s Accepted Manuscript, AAM)
Size: 2.04 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1155361
Citations
  • Scopus: 2