Toxicity Detection in Multiplayer Online Games

Marcus Märtens, Siqi Shen, Alex Iosup, Fernando Kuipers

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-review

15 Citations (Scopus)
1553 Downloads (Pure)

Abstract

Social interaction in multiplayer online games is an essential feature for a growing number of players worldwide. However, interaction between players can lead to undesired and unintended behavior, particularly if the game is designed to be highly competitive. Communication channels can be abused to harass and verbally assault other players, which defeats the purpose of an entertainment game by creating a toxic player community. Using a novel natural language processing framework, we detect profanity in the chat logs of a popular Multiplayer Online Battle Arena (MOBA) game and develop a method to classify toxic remarks. We show that toxicity is non-trivially linked to game success.
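The paper's own framework is not reproduced here. As an illustration only, the following is a minimal sketch of keyword-based profanity flagging over per-player chat logs; the lexicon, log format, and function names are hypothetical assumptions, not the method described in the paper.

```python
# Hypothetical sketch: flag profane chat lines per player.
# The profanity lexicon, log format, and scoring are illustrative only
# and are NOT the framework described in the paper.
import re
from collections import Counter

PROFANITY = {"noob", "idiot", "trash"}  # placeholder lexicon


def toxic_lines(chat_log):
    """Yield (player, text) pairs whose text contains a flagged word.

    chat_log: iterable of (player, text) tuples, e.g. parsed MOBA chat.
    """
    word_re = re.compile(r"[a-z']+")
    for player, text in chat_log:
        words = set(word_re.findall(text.lower()))
        if words & PROFANITY:
            yield player, text


def toxicity_counts(chat_log):
    """Count flagged lines per player as a crude toxicity score."""
    return Counter(player for player, _ in toxic_lines(chat_log))


if __name__ == "__main__":
    demo = [("p1", "gg wp"), ("p2", "you are such a noob"), ("p2", "trash team")]
    print(toxicity_counts(demo))  # Counter({'p2': 2})
```

A lexicon lookup like this only approximates profanity detection; relating flagged remarks to match outcome, as the abstract describes, would additionally require per-match game results joined to the chat data.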
Original language: English
Title of host publication: 2015 International Workshop on Network and Systems Support for Games (NetGames)
Place of publication: Zagreb, Croatia
Publisher: IEEE
Number of pages: 6
ISBN (Electronic): 978-1-5090-0068-5
ISBN (Print): 978-1-5090-0069-2
DOIs
Publication status: Published - 2015
Event: 14th International Workshop on Network and Systems Support for Games
Duration: 3 Dec 2015 → …

Conference

Conference: 14th International Workshop on Network and Systems Support for Games
Abbreviated title: NetGames 2015
Period: 3/12/15 → …
