Researchers from Queen Mary University of London’s Centre for Digital Music have created a tool that allows real-time visualisations of the audience’s reaction during live music performances.
Mood Visualiser uses physiological data such as skin conductance – a measure of emotional arousal based on the amount of sweat secreted – and pulse to gauge whether the audience is calm or excited.
The tool was created as part of a Media and Arts Technology project by student Qi Gong, who commented: “The prototype has been developed using a single sensor but in the future, we plan to experiment with multiple sensors using wireless technology so we can monitor more than one audience member.”
“This project bridges a gap in traditional live music visualisations, which do not take into account audience members’ reactions to the music,” said project supervisor Mathieu Barthet, a lecturer at the Centre for Digital Music, based in the School of Electronic Engineering and Computer Science.
“We hope that this can foster a new type of creative interaction between audience and performers – one of the challenges is to map the sensor data to graphics in a meaningful way that’s accessible to all.”
The current system shows when the audience becomes excited by generating moving particles on screen. The number of particles and their fluctuations relate to the skin conductance, whilst their size relates to the pulse measurements.
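To make that mapping concrete, here is a minimal C++ sketch of how biometric readings could be turned into particle parameters. The scaling constants, ranges, and function names are illustrative assumptions, not the Mood Visualiser’s actual values.

```cpp
#include <algorithm>
#include <cstdio>

// Hypothetical mapping from biometric readings to particle parameters.
// All constants and ranges are illustrative, not taken from the real system.
struct ParticleParams {
    int   count;   // driven by the skin conductance level
    float jitter;  // driven by short-term conductance fluctuation
    float size;    // driven by the pulse (beats per minute)
};

ParticleParams mapBiometrics(float conductanceUs,    // skin conductance, microsiemens
                             float conductanceDelta, // change since the last reading
                             float pulseBpm) {
    ParticleParams p;
    // Higher conductance (higher arousal) -> more particles on screen.
    p.count = std::clamp(static_cast<int>(conductanceUs * 20.0f), 10, 500);
    // Rapid conductance changes -> stronger particle motion.
    p.jitter = std::clamp(conductanceDelta * 5.0f, 0.0f, 1.0f);
    // Faster pulse -> larger particles, scaled over an assumed 40-180 BPM range.
    p.size = 2.0f + 10.0f * std::clamp((pulseBpm - 40.0f) / 140.0f, 0.0f, 1.0f);
    return p;
}

int main() {
    ParticleParams p = mapBiometrics(4.2f, 0.08f, 92.0f);
    std::printf("count=%d jitter=%.2f size=%.1f\n", p.count, p.jitter, p.size);
}
```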
The system uses an e-Health platform designed and distributed by Spanish company Cooking Hacks. The platform works with Arduino – an open-source framework for creating interactive electronic objects – and provides remote access to the biometric data.
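As an illustration of this setup, an Arduino-style sketch along the following lines could stream readings to a host machine running the visualiser. The eHealth function names follow the library’s published examples but are assumptions here, as are the sample rate, baud rate, and output format.

```cpp
// Illustrative Arduino-style sketch streaming biometric readings over serial.
// Assumes the Cooking Hacks e-Health library; function names follow its
// published examples and may differ between library versions.
#include <eHealth.h>

void setup() {
  Serial.begin(115200);
  eHealth.initPulsioximeter();  // assumed: starts the pulse sensor
}

void loop() {
  float conductance = eHealth.getSkinConductance();  // microsiemens
  int bpm = eHealth.getBPM();                        // beats per minute
  // Send a simple comma-separated line for the host visualiser to parse.
  Serial.print(conductance);
  Serial.print(",");
  Serial.println(bpm);
  delay(100);  // roughly ten readings per second
}
```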
Mood Visualiser was demonstrated at a hack day in Barcelona during the Sonar Festival.