Past research has demonstrated that removing implicit gender information from the user-item matrix does not result in substantial performance losses. Such results point towards promising solutions for protecting users’ privacy without compromising prediction performance, which are of particular interest in multistakeholder environments. Here, we investigate BlurMe, a gender obfuscation technique that has been shown to block classifiers from inferring binary gender from users’ profiles. We first point out a serious shortcoming of BlurMe: Simple data visualizations can reveal that BlurMe has been applied to a data set, including which items have been impacted. We then propose an extension to BlurMe, called BlurM(or)e, that addresses this issue. We reproduce the original BlurMe experiments with the MovieLens data set, and point out the relative advantages of BlurM(or)e.
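To make the idea concrete, below is a minimal sketch of a BlurMe-style obfuscation step. This is not the authors' code; it assumes that items have already been ranked by how strongly they indicate the opposite gender (e.g., via the weights of a linear classifier trained on the user-item matrix), and that obfuscation adds a fraction of those items to each user's profile. The function name and parameters are illustrative.

```python
# Hedged sketch of a BlurMe-style obfuscation step (illustrative, not the
# authors' implementation). Assumption: `opposite_gender_items` is a list of
# item ids ranked by gender-indicativeness, most indicative first.

def obfuscate_profile(profile, opposite_gender_items, fraction=0.1):
    """Add the top-ranked opposite-gender items not already in `profile`.

    profile: set of item ids the user has interacted with
    opposite_gender_items: item ids ranked most-indicative first
    fraction: how many items to add, relative to the profile size
    """
    n_extra = max(1, int(len(profile) * fraction))
    additions = [i for i in opposite_gender_items if i not in profile]
    return profile | set(additions[:n_extra])

# Toy example: a 5-item profile padded with 2 opposite-gender-indicative items.
profile = {1, 2, 3, 4, 5}
opposite_ranked = [9, 8, 3, 7]  # item 3 is skipped (already in the profile)
blurred = obfuscate_profile(profile, opposite_ranked, fraction=0.4)
print(sorted(blurred))  # → [1, 2, 3, 4, 5, 8, 9]
```

The shortcoming the paper highlights follows directly from this scheme: because the same highly indicative items are added across many profiles, their interaction counts spike, so a simple per-item frequency plot can reveal both that obfuscation was applied and which items were affected.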

Original language: English
Title of host publication: RMSE 2019 Workshop on Recommendation in Multi-stakeholder Environments
Subtitle of host publication: Proceedings of the Workshop on Recommendation in Multi-stakeholder Environments co-located with the 13th ACM Conference on Recommender Systems (RecSys 2019)
Editors: Robin Burke, Himan Abdollahpouri, Edward Malthouse, KP Thai, Yongfeng Zhang
Publisher: CEUR-WS.org
Pages: 1-5
Number of pages: 5
Publication status: Published - 2019
Event: 2019 Workshop on Recommendation in Multi-Stakeholder Environments, RMSE 2019 - Copenhagen, Denmark
Duration: 20 Sep 2019 - 20 Sep 2019

Publication series

Name: CEUR Workshop Proceedings
Volume: 2440
ISSN (Print): 1613-0073

Conference

Conference: 2019 Workshop on Recommendation in Multi-Stakeholder Environments, RMSE 2019
Country: Denmark
City: Copenhagen
Period: 20/09/19 - 20/09/19

    Research areas

  • Data Obfuscation, Privacy, Recommender Systems
