Validation methodology for expert-annotated datasets: Event annotation case study

Oana Inel*, Lora Aroyo

*Corresponding author for this work

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-reviewed


Abstract

Event detection remains a difficult task due to the complexity and ambiguity of such entities. On the one hand, we observe low inter-annotator agreement among experts when annotating events, despite the multitude of existing annotation guidelines and their numerous revisions. On the other hand, event extraction systems achieve a lower F1-score than extractors for other entity types, such as people or locations. In this paper we study the consistency and completeness of expert-annotated datasets for events and time expressions, and propose a data-agnostic methodology for validating such datasets along both dimensions. Furthermore, we combine the power of crowds and machines to correct and extend expert-annotated event datasets. We show the benefit of using crowd-annotated events to train and evaluate a state-of-the-art event extraction system. Our results show that the crowd-annotated events increase the performance of the system by at least 5.3%.
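The abstract reports extraction performance in terms of F1-score. As a minimal illustration (not code from the paper), exact-match span-level precision, recall, and F1 for event annotations can be computed by comparing gold and predicted character-offset spans:

```python
def span_f1(gold, predicted):
    """Exact-match span-level precision, recall, and F1.

    gold, predicted: collections of (start, end) character-offset spans.
    """
    gold, predicted = set(gold), set(predicted)
    tp = len(gold & predicted)  # spans the system got exactly right
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Illustrative spans only (not from the paper's datasets):
gold = {(0, 4), (10, 17), (25, 31)}    # expert-annotated event spans
pred = {(0, 4), (10, 17), (40, 45)}    # system output
p, r, f1 = span_f1(gold, pred)
# Two of three gold spans are matched exactly, so p = r = f1 = 2/3.
```

Evaluation campaigns often also report partial-match variants; the exact-match version above is the strictest and simplest.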

Original language: English
Title of host publication: 2nd Conference on Language, Data and Knowledge, LDK 2019
Editors: Gerard de Melo, Bettina Klimek, Christian Fäth, Paul Buitelaar, Milan Dojchinovski, Maria Eskevich, John P. McCrae, Christian Chiarcos
Publisher: Schloss Dagstuhl - Leibniz-Zentrum für Informatik GmbH, Dagstuhl Publishing
Pages: 1-15
Number of pages: 15
Volume: 70
ISBN (Electronic): 9783959771054
DOIs
Publication status: Published - 1 May 2019
Event: 2nd Conference on Language, Data and Knowledge, LDK 2019 - Leipzig, Germany
Duration: 20 May 2019 - 23 May 2019

Conference

Conference: 2nd Conference on Language, Data and Knowledge, LDK 2019
Country/Territory: Germany
City: Leipzig
Period: 20/05/19 - 23/05/19

Keywords

  • Crowdsourcing
  • Event extraction
  • Human-in-the-loop
  • Time extraction
