A review on the methods to evaluate crowd contributions in crowdsourcing applications
Computation theory; Intelligent computing; Surveying; Automated evaluation; Evaluation methods; Expert judgement; Grounded theory; New evaluation methods; Systematic review; Crowdsourcing
Main Authors: Aris H.; Azizan A.
Format: Conference Paper
Published: Springer, 2023
id | my.uniten.dspace-25787
---|---
record_format | dspace

Abstract: Because crowdsourcing openly accepts contributions from the crowd, these contributions must be evaluated to ensure their reliability. A number of evaluation methods are used in existing crowdsourcing applications for this purpose, and this study aims to identify and document them. To do so, 50 crowdsourcing applications obtained from an extensive literature and online search were reviewed. The analysis found that, depending on the type of crowdsourcing application (simple, complex, or creative), three different methods are used: expert judgement, rating, and feedback. Expert judgement is mostly used in complex and creative crowdsourcing initiatives, while rating is widely used in simple ones. This paper is the only reference known so far that documents the current state of evaluation methods in existing crowdsourcing applications. It should be useful in determining the way forward for research in the area, such as designing a new evaluation method, and it justifies the need for an automated evaluation method for crowdsourced contributions. © Springer Nature Switzerland AG 2020.

Status: Final
Available: 2023-05-29T08:14:17Z
Published: 2020
Format: Conference Paper
DOI: 10.1007/978-3-030-33582-3_97
Scopus EID: 2-s2.0-85077774650
Links: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85077774650&doi=10.1007%2f978-3-030-33582-3_97&partnerID=40&md5=3fb159fa6fe6068074b79fc784465406 ; https://irepository.uniten.edu.my/handle/123456789/25787
Volume: 1073
Pages: 1031-1041
Publisher: Springer
Source: Scopus
institution | Universiti Tenaga Nasional
building | UNITEN Library
collection | Institutional Repository
continent | Asia
country | Malaysia
content_provider | Universiti Tenaga Nasional
content_source | UNITEN Institutional Repository
url_provider | http://dspace.uniten.edu.my/
description | Computation theory; Intelligent computing; Surveying; Automated evaluation; Evaluation methods; Expert judgement; Grounded theory; New evaluation methods; Systematic review; Crowdsourcing
author2 | 13608397500
author_facet | 13608397500 Aris H. Azizan A.
format | Conference Paper
author | Aris H. Azizan A.
spellingShingle | Aris H. Azizan A. A review on the methods to evaluate crowd contributions in crowdsourcing applications
author_sort | Aris H.
title | A review on the methods to evaluate crowd contributions in crowdsourcing applications
publisher | Springer
publishDate | 2023