Abstract
This demo showcases a real-time visualisation displaying the level of engagement of a group of people attending a jazz concert. Drawing on wearable sensor technology and machine learning principles, we present how this visualisation for enhancing events was developed following a user-centred approach. We describe the process of running an experiment using our custom physiological sensor platform, gathering requirements for the visualisation, and finally implementing it. The end result is a collaborative artwork that enhances people's immersion in cultural events.
Original language | English |
---|---|
Title of host publication | MM 2017 - Proceedings of the 2017 ACM Multimedia Conference |
Publisher | Association for Computing Machinery (ACM) |
Pages | 1239-1240 |
Number of pages | 2 |
ISBN (Electronic) | 9781450349062 |
DOIs | |
Publication status | Published - 23 Oct 2017 |
Event | 25th ACM International Conference on Multimedia, MM 2017 - Mountain View, United States. Duration: 23 Oct 2017 → 27 Oct 2017 |
Conference
Conference | 25th ACM International Conference on Multimedia, MM 2017 |
---|---|
Country/Territory | United States |
City | Mountain View |
Period | 23/10/17 → 27/10/17 |
Keywords
- Cultural experiences
- Data visualisation
- GSR
- Interactive art
- Sensors
- Shared experiences