An Adaptive System to Manage Playlists and Lighting Scenarios Based on the User's Emotions
Generosi A.; Ceccacci S.
2019-01-01
Abstract
This paper introduces a new system capable of adaptively managing multimedia content (e.g., music, video clips) and lighting scenarios based on the user's detected emotional state. The system recognizes the user's emotion from facial expressions and maps it into the same 2D valence-arousal space in which the multimedia content is annotated, then matches the result with an appropriate lighting color. Results of preliminary tests suggest that the proposed system is able to detect the user's emotional state and select suitable music and light colors in a symbiotic way.
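A minimal sketch of the matching idea described in the abstract follows, assuming content items are annotated with (valence, arousal) coordinates and that lighting color is derived from the same coordinates; the function names, playlist annotations, and color mapping are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: match a detected emotion to content and a light color
# in a 2D valence-arousal space. All names and mappings are assumptions.
import colorsys
import math

# Example content library annotated with (valence, arousal) in [-1, 1].
PLAYLIST = {
    "calm_piano":   (0.4, -0.6),
    "upbeat_pop":   (0.8,  0.7),
    "melancholic":  (-0.5, -0.4),
    "intense_rock": (-0.2,  0.8),
}

def nearest_content(valence, arousal, library=PLAYLIST):
    """Pick the item whose valence-arousal annotation is closest (Euclidean)."""
    return min(library, key=lambda name: math.dist((valence, arousal), library[name]))

def light_color(valence, arousal):
    """Map valence to hue (negative -> blue, positive -> warm red) and
    arousal to brightness. Returns an (R, G, B) tuple in 0..255."""
    hue = 0.66 * (1.0 - (valence + 1.0) / 2.0)    # -1 -> blue (0.66), +1 -> red (0.0)
    value = 0.3 + 0.7 * (arousal + 1.0) / 2.0     # higher arousal -> brighter light
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, value)
    return tuple(round(c * 255) for c in (r, g, b))

if __name__ == "__main__":
    v, a = 0.7, 0.5                 # e.g., a "happy/excited" face reading
    print(nearest_content(v, a))    # -> upbeat_pop
    print(light_color(v, a))        # -> a warm, bright RGB triple
```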