Abstract
The benefits of exploiting multi-modality in the analysis of human-human social behaviour have been demonstrated widely in the community. An important aspect of this problem is the collection of datasets that provide a rich and realistic representation of how people actually socialize with each other in real life. These subtle coordination patterns are influenced by individual beliefs, goals, and desires related to what an individual stands to lose or gain in the activities they perform in their everyday life. Such conditions cannot be easily replicated in a lab setting, and capturing them requires a radical rethinking of both how and what to collect. This tutorial provides a guide on how to create such multi-modal, multi-sensor datasets while holistically considering the entire experimental design and data collection process.
Original language | English |
---|---|
Title of host publication | MM 2019 - Proceedings of the 27th ACM International Conference on Multimedia |
Publisher | Association for Computing Machinery (ACM) |
Pages | 2714-2715 |
Number of pages | 2 |
ISBN (Electronic) | 9781450368896 |
Publication status | Published - 15 Oct 2019 |
Event | 27th ACM International Conference on Multimedia, MM 2019 - Nice, France (21 Oct 2019 → 25 Oct 2019) |
Conference
Conference | 27th ACM International Conference on Multimedia, MM 2019 |
---|---|
Country/Territory | France |
City | Nice |
Period | 21/10/19 → 25/10/19 |
Keywords
- ConfLab
- Multimodal Synchronization
- Social Behaviour Analysis
- Wearable Sensors