A little competition never hurt anyone's relevance assessments

Yuan Jin, Mark J. Carman, Lexing Xie

    Research output: Contribution to journal › Conference article › peer-review

    2 Citations (Scopus)

    Abstract

    This paper investigates the effect of real-time performance feedback and competition on the accuracy of crowd-workers for a document relevance assessment task. Through a series of controlled tests, we show that displaying a leaderboard to crowd-workers can motivate them to improve their performance, provided a bonus is offered to the best-performing workers. This effect is observed even when test questions are used to enforce quality control during task completion.

    Original language: English
    Pages (from-to): 29-36
    Number of pages: 8
    Journal: CEUR Workshop Proceedings
    Volume: 1642
    Publication status: Published - 2016
    Event: 3rd International Workshop on Gamification for Information Retrieval, GamifIR 2016 - Pisa, Italy
    Duration: 21 Jul 2016 → …
