
The results of our exploratory study provide new insights into crowdsourcing knowledge-intensive tasks. We designed and performed an annotation task on a print collection of the Rijksmuseum Amsterdam, involving experts and crowd workers in the domain-specific description of depicted flowers. We created a testbed to collect annotations from flower experts and crowd workers and analyzed these with regard to user agreement. The findings show promising results, demonstrating how, for given categories, nichesourcing can provide useful annotations by connecting crowdsourcing to domain expertise.

Original language: English
Title of host publication: WWW 2014 Companion - Proceedings of the 23rd International Conference on World Wide Web
Publisher: Association for Computing Machinery (ACM)
Pages: 567-568
Number of pages: 2
ISBN (Electronic): 9781450327459
State: Published - 7 Apr 2014
Event: 23rd International Conference on World Wide Web, WWW 2014 - Seoul, Korea, Republic of

Conference

Conference: 23rd International Conference on World Wide Web, WWW 2014
Country: Korea, Republic of
City: Seoul
Period: 7/04/14 - 11/04/14

Research areas

Crowdsourcing, Cultural heritage, Knowledge intensive tasks, Nichesourcing, Tagging
