Crowdsourcing Truthfulness: The Impact of Judgment Scale and Assessor Bias

  • David La Barbera
  • Kevin Roitero
  • Damiano Spina
  • Stefano Mizzaro
  • Gianluca Demartini
Proceedings of ECIR'20, 2020

News content can sometimes be misleading and influence users’ decision-making processes (e.g., voting decisions). Quantitatively assessing the truthfulness of content becomes key, but it is often challenging and thus done by experts. In this work we look at how experts and non-experts assess the truthfulness of content, focusing on the effect of the adopted judgment scale and of assessors’ own bias on the judgments they perform. Our results indicate a clear effect of assessors’ political background on their judgments: they tend to trust content that is aligned with their own beliefs, even if experts have marked it as false. Crowd assessors also seem to prefer coarse-grained scales, as they tend to use a few extreme values rather than the full breadth of fine-grained scales.

@inproceedings{labarbera2020crowdsourcing, 
   title={{Crowdsourcing Truthfulness: The Impact of Judgment Scale and Assessor Bias}},
   booktitle={Proceedings of ECIR'20},
   author={{La Barbera}, David and Roitero, Kevin and Demartini, Gianluca and Mizzaro, Stefano and Spina, Damiano},
   year={2020}
}