Documents

  • Final published version, 869 KB, PDF document

Recommender system research has evolved to focus on developing algorithms capable of high performance in online systems. This development calls for a new evaluation infrastructure that supports multi-dimensional evaluation of recommender systems: today's researchers must analyze algorithms with respect to a variety of aspects, including predictive performance and scalability, and must subject algorithms to realistic conditions in online A/B tests.
We introduce two resources supporting such evaluation methodologies: the new data set of stream recommendation interactions released for CLEF NewsREEL 2017, and the new Open Recommendation Platform (ORP). The data set allows researchers to study a stream recommendation problem closely by "replaying" it locally, and ORP makes it possible to take this evaluation "live" in a living lab scenario. Specifically, ORP allows researchers to deploy their algorithms in a live stream to carry out A/B tests. To our knowledge, NewsREEL is the first online news recommender system resource to be put at the disposal of the research community. To encourage others to develop comparable resources for a wide range of domains, we present a list of practical lessons learned in the development of the data set and ORP.
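The "replaying" idea mentioned in the abstract can be illustrated with a minimal sketch: iterate over a recorded interaction stream in order, ask the recommender for a prediction before each event is revealed, and only then let the model learn from the event. All names below (`replay_evaluate`, the toy popularity recommender, the event tuples) are illustrative assumptions, not part of the NewsREEL data set or the ORP API.

```python
from collections import Counter

def replay_evaluate(stream, recommend, update, k=5):
    """Replay a recorded stream of (user, clicked_item) events in order.

    Before each event is revealed, ask the recommender for a top-k list;
    the event counts as a hit when the clicked item appears in that list.
    Returns the fraction of hits (a simple offline click-through proxy).
    """
    hits = total = 0
    for user, item in stream:
        ranked = recommend(user)[:k]   # predict before revealing the event
        hits += item in ranked
        total += 1
        update(user, item)             # now let the model learn from it
    return hits / total if total else 0.0

# Usage: a toy popularity-based recommender evaluated on four events.
popularity = Counter()
ctr = replay_evaluate(
    [("u1", "a"), ("u2", "a"), ("u1", "b"), ("u3", "a")],
    recommend=lambda user: [i for i, _ in popularity.most_common()],
    update=lambda user, item: popularity.update([item]),
)
print(ctr)  # hits on events 2 and 4 only -> 0.5
```

The key property of replay evaluation is the predict-then-update order inside the loop, which prevents the model from seeing an event before it is asked to predict it.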
Original language: English
Title of host publication: Proceedings of the 40th International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2017
Place of publication: New York, NY
Publisher: Association for Computing Machinery (ACM)
Pages: 1257-1260
Number of pages: 4
ISBN (electronic): 978-1-4503-5022-8
State: Published - 2017
Event: SIGIR 2017 - Tokyo, Japan

Conference

Conference: SIGIR 2017
Country: Japan
City: Tokyo
Period: 7/08/17 - 11/08/17

    Research areas

  • streams, recommender system, multi-dimensional benchmarking

ID: 35744048