Standard

The MatchNMingle dataset: A novel multi-sensor resource for the analysis of social interactions and group dynamics in-the-wild during free-standing conversations and speed dates. / Cabrera-Quiros, Laura; Demetriou, Andrew; Gedik, Ekin; van der Meij, Leander; Hung, Hayley.

In: IEEE Transactions on Affective Computing, Vol. PP, No. 99, 2018, p. 1-17.

Research output: Contribution to journal › Article › Scientific › peer-review

BibTeX

@article{ee5acc280b8e4ec7944a088e693a2065,
title = "The MatchNMingle dataset: A novel multi-sensor resource for the analysis of social interactions and group dynamics in-the-wild during free-standing conversations and speed dates",
abstract = "We present MatchNMingle, a novel multimodal/multisensor dataset for the analysis of free-standing conversational groups and speed-dates in-the-wild. MatchNMingle leverages the use of wearable devices and overhead cameras to record social interactions of 92 people during real-life speed-dates, followed by a cocktail party. To our knowledge, MatchNMingle has the largest number of participants, longest recording time and largest set of manual annotations for social actions available in this context in a real-life scenario. It consists of 2 hours of data from wearable acceleration, binary proximity, video, audio, personality surveys, frontal pictures and speed-date responses. Participants' positions and group formations were manually annotated; as were social actions (eg. speaking, hand gesture) for 30 minutes at 20fps making it the first dataset to incorporate the annotation of such cues in this context. We present an empirical analysis of the performance of crowdsourcing workers against trained annotators in simple and complex annotation tasks, founding that although efficient for simple tasks, using crowdsourcing workers for more complex tasks like social action annotation led to additional overhead and poor inter-annotator agreement compared to trained annotators (differences up to 0.4 in Fleiss' Kappa coefficients). We also provide example experiments of how MatchNMingle can be used.",
keywords = "Acceleration, Cameras, Computers, Crowdsourcing, f-formation, Manuals, mingle, Multimodal dataset, personality traits, Sensors, Speed-dates, Task analysis, wearable acceleration",
author = "Laura Cabrera-Quiros and Andrew Demetriou and Ekin Gedik and {van der Meij}, Leander and Hayley Hung",
year = "2018",
doi = "10.1109/TAFFC.2018.2848914",
language = "English",
volume = "PP",
pages = "1--17",
journal = "IEEE Transactions on Affective Computing",
issn = "1949-3045",
publisher = "Institute of Electrical and Electronics Engineers (IEEE)",
number = "99",

}
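
The abstract above reports inter-annotator agreement differences of up to 0.4 in Fleiss' Kappa between crowdsourcing workers and trained annotators. For reference, a minimal sketch of the standard Fleiss' kappa computation follows; this is not the authors' code, and the toy ratings matrix is hypothetical rather than drawn from MatchNMingle.

import numpy as np

def fleiss_kappa(counts):
    """Fleiss' kappa for a (subjects x categories) matrix of rating counts.

    counts[i, j] = number of raters who assigned category j to subject i;
    every row must sum to the same number of raters n.
    """
    counts = np.asarray(counts, dtype=float)
    n_subjects = counts.shape[0]
    n_raters = counts[0].sum()

    # Observed agreement: average fraction of agreeing rater pairs per subject.
    p_i = (np.square(counts).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))
    p_bar = p_i.mean()

    # Chance agreement from the marginal category proportions.
    p_j = counts.sum(axis=0) / (n_subjects * n_raters)
    p_e = np.square(p_j).sum()

    return (p_bar - p_e) / (1.0 - p_e)

# Hypothetical example: 5 video segments, 3 annotators, binary "speaking" label.
# Columns count votes for [not speaking, speaking].
ratings = [[3, 0], [0, 3], [2, 1], [1, 2], [3, 0]]
print(round(fleiss_kappa(ratings), 3))  # 0.444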

RIS

TY - JOUR

T1 - The MatchNMingle dataset

T2 - A novel multi-sensor resource for the analysis of social interactions and group dynamics in-the-wild during free-standing conversations and speed dates

AU - Cabrera-Quiros, Laura

AU - Demetriou, Andrew

AU - Gedik, Ekin

AU - van der Meij, Leander

AU - Hung, Hayley

PY - 2018

Y1 - 2018

N2 - We present MatchNMingle, a novel multimodal/multisensor dataset for the analysis of free-standing conversational groups and speed-dates in-the-wild. MatchNMingle leverages the use of wearable devices and overhead cameras to record social interactions of 92 people during real-life speed-dates, followed by a cocktail party. To our knowledge, MatchNMingle has the largest number of participants, longest recording time and largest set of manual annotations for social actions available in this context in a real-life scenario. It consists of 2 hours of data from wearable acceleration, binary proximity, video, audio, personality surveys, frontal pictures and speed-date responses. Participants' positions and group formations were manually annotated, as were social actions (e.g. speaking, hand gesture) for 30 minutes at 20 fps, making it the first dataset to incorporate the annotation of such cues in this context. We present an empirical analysis of the performance of crowdsourcing workers against trained annotators in simple and complex annotation tasks, finding that although efficient for simple tasks, using crowdsourcing workers for more complex tasks like social action annotation led to additional overhead and poor inter-annotator agreement compared to trained annotators (differences of up to 0.4 in Fleiss' Kappa coefficients). We also provide example experiments of how MatchNMingle can be used.

AB - We present MatchNMingle, a novel multimodal/multisensor dataset for the analysis of free-standing conversational groups and speed-dates in-the-wild. MatchNMingle leverages the use of wearable devices and overhead cameras to record social interactions of 92 people during real-life speed-dates, followed by a cocktail party. To our knowledge, MatchNMingle has the largest number of participants, longest recording time and largest set of manual annotations for social actions available in this context in a real-life scenario. It consists of 2 hours of data from wearable acceleration, binary proximity, video, audio, personality surveys, frontal pictures and speed-date responses. Participants' positions and group formations were manually annotated, as were social actions (e.g. speaking, hand gesture) for 30 minutes at 20 fps, making it the first dataset to incorporate the annotation of such cues in this context. We present an empirical analysis of the performance of crowdsourcing workers against trained annotators in simple and complex annotation tasks, finding that although efficient for simple tasks, using crowdsourcing workers for more complex tasks like social action annotation led to additional overhead and poor inter-annotator agreement compared to trained annotators (differences of up to 0.4 in Fleiss' Kappa coefficients). We also provide example experiments of how MatchNMingle can be used.

KW - Acceleration

KW - Cameras

KW - Computers

KW - Crowdsourcing

KW - f-formation

KW - Manuals

KW - mingle

KW - Multimodal dataset

KW - personality traits

KW - Sensors

KW - Speed-dates

KW - Task analysis

KW - wearable acceleration

UR - http://www.scopus.com/inward/record.url?scp=85049103438&partnerID=8YFLogxK

U2 - 10.1109/TAFFC.2018.2848914

DO - 10.1109/TAFFC.2018.2848914

M3 - Article

AN - SCOPUS:85049103438

VL - PP

SP - 1

EP - 17

JO - IEEE Transactions on Affective Computing

JF - IEEE Transactions on Affective Computing

SN - 1949-3045

IS - 99

ER -
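
The RIS record above is machine-readable. As an illustration, here is a minimal sketch of loading it with the rispy package; the file name is an assumption, and the dictionary keys follow rispy's default RIS tag mapping (e.g. DO → 'doi', AU → 'authors').

import rispy  # pip install rispy

with open("matchnmingle.ris") as f:
    entries = rispy.load(f)  # list of dicts, one per RIS record

record = entries[0]
print(record["doi"])      # 10.1109/TAFFC.2018.2848914
print(record["authors"])  # ['Cabrera-Quiros, Laura', ...]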
