A review on the methods to evaluate crowd contributions in crowdsourcing applications

Hazleen Aris, Aqilah Azizan

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Because crowdsourcing openly accepts contributions from the crowd, these contributions must be evaluated to ensure their reliability. A number of evaluation methods are used in existing crowdsourcing applications to evaluate such contributions. This study aims to identify and document these methods. To do so, 50 crowdsourcing applications obtained from an extensive literature and online search were reviewed. Analysis of the applications found that, depending on the type of crowdsourcing application, whether simple, complex or creative, three different methods are used: expert judgement, rating and feedback. While expert judgement is mostly used in complex and creative crowdsourcing initiatives, rating is widely used in simple ones. This paper is the only reference known so far that documents the current state of evaluation methods in existing crowdsourcing applications. It would be useful in determining the way forward for research in the area, such as designing a new evaluation method. It also justifies the need for an automated evaluation method for crowdsourced contributions.

Original language: English
Title of host publication: Emerging Trends in Intelligent Computing and Informatics - Data Science, Intelligent Information Systems and Smart Computing
Editors: Faisal Saeed, Fathey Mohammed, Nadhmi Gazem
Publisher: Springer
Pages: 1031-1041
Number of pages: 11
ISBN (Print): 9783030335816
DOIs: https://doi.org/10.1007/978-3-030-33582-3_97
Publication status: Published - 01 Jan 2020
Event: 4th International Conference of Reliable Information and Communication Technology, IRICT 2019 - Johor Bahru, Malaysia
Duration: 22 Sep 2019 - 23 Sep 2019

Publication series

Name: Advances in Intelligent Systems and Computing
Volume: 1073
ISSN (Print): 2194-5357
ISSN (Electronic): 2194-5365

Conference

Conference: 4th International Conference of Reliable Information and Communication Technology, IRICT 2019
Country: Malaysia
City: Johor Bahru
Period: 22/09/19 - 23/09/19

All Science Journal Classification (ASJC) codes

  • Control and Systems Engineering
  • Computer Science(all)


Cite this

Aris, H., & Azizan, A. (2020). A review on the methods to evaluate crowd contributions in crowdsourcing applications. In F. Saeed, F. Mohammed, & N. Gazem (Eds.), Emerging Trends in Intelligent Computing and Informatics - Data Science, Intelligent Information Systems and Smart Computing (pp. 1031-1041). (Advances in Intelligent Systems and Computing; Vol. 1073). Springer. https://doi.org/10.1007/978-3-030-33582-3_97