
  • Thomas Röggla
  • Najereh Shirzadian
  • Zhiyuan Zheng
  • Alice Panza
  • Pablo Cesar

This demo showcases a real-time visualisation displaying the level of engagement of a group of people attending a Jazz concert. Based on wearable sensor technology and machine learning principles, we present how this visualisation for enhancing events was developed following a user-centric approach. We describe the process of running an experiment with our custom physiological sensor platform, gathering requirements for the visualisation, and finally implementing it. The end result is a collaborative artwork that enhances people's immersion in cultural events.
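
The visualisation described above is driven by readings from wearable physiological sensors (GSR, per the research areas below). As a rough illustration only, since the abstract does not describe the implementation, the following Python sketch shows one hypothetical way to turn per-attendee GSR streams into a single group engagement value that a real-time display could consume; the class and function names are invented for this example.

    # Hypothetical sketch (not the authors' implementation): aggregating
    # galvanic skin response (GSR) samples from several wearables into one
    # group engagement score for a real-time visualisation.
    from collections import deque
    from statistics import mean, pstdev


    class EngagementAggregator:
        """Keeps a sliding window of GSR samples per attendee and reports a
        normalised group engagement score in the range [0, 1]."""

        def __init__(self, window_size: int = 128) -> None:
            self.window_size = window_size
            self.windows = {}  # attendee id -> bounded deque of samples

        def add_sample(self, attendee_id: str, gsr_value: float) -> None:
            # One bounded window per attendee; old samples fall off automatically.
            window = self.windows.setdefault(
                attendee_id, deque(maxlen=self.window_size)
            )
            window.append(gsr_value)

        def _person_score(self, window) -> float:
            # Z-score of the latest sample against the attendee's own recent
            # history, squashed into [0, 1] so different skin-conductance
            # baselines become comparable.
            if len(window) < 2:
                return 0.5
            mu, sigma = mean(window), pstdev(window)
            if sigma == 0:
                return 0.5
            z = (window[-1] - mu) / sigma
            return min(1.0, max(0.0, 0.5 + z / 4))

        def group_engagement(self) -> float:
            # Group engagement is the average of the individual scores.
            scores = [self._person_score(w) for w in self.windows.values()]
            return mean(scores) if scores else 0.0


    if __name__ == "__main__":
        agg = EngagementAggregator(window_size=8)
        for t in range(8):
            agg.add_sample("attendee-1", 2.0 + 0.1 * t)  # slowly rising conductance
            agg.add_sample("attendee-2", 3.0)            # flat signal
        print(f"group engagement: {agg.group_engagement():.2f}")

In a deployment like the one described, such a score would be recomputed continuously as new samples arrive and forwarded to the visualisation front end.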

Original language: English
Title of host publication: MM 2017 - Proceedings of the 2017 ACM Multimedia Conference
Publisher: Association for Computing Machinery (ACM)
Pages: 1239-1240
Number of pages: 2
ISBN (Electronic): 9781450349062
DOIs
Publication status: Published - 23 Oct 2017
Event: 25th ACM International Conference on Multimedia, MM 2017 - Mountain View, United States
Duration: 23 Oct 2017 - 27 Oct 2017

Conference

Conference: 25th ACM International Conference on Multimedia, MM 2017
Country: United States
City: Mountain View
Period: 23/10/17 - 27/10/17

    Research areas

  • Cultural experiences, Data visualisation, GSR, Interactive art, Sensors, Shared experiences
