TY - JOUR
T1 - Gamers, citizen scientists, and data
T2 - Exploring participant contributions in two games with a purpose
AU - Prestopnik, Nathan
AU - Crowston, Kevin
AU - Wang, Jun
N1 - Publisher Copyright:
© 2016 Elsevier Ltd
PY - 2017/3/1
Y1 - 2017/3/1
N2 - Two key problems for crowd-sourcing systems are motivating contributions from participants and ensuring the quality of those contributions. Games have been suggested as a motivational approach to encourage contribution, but attracting participation through game play rather than intrinsic interest raises concerns about the quality of the contributions provided. These concerns are particularly important in the context of citizen science projects, where the contributions are data to be used for scientific research. To assess the validity of concerns about the effects of gaming on data quality, we compare the quality of data obtained from two citizen science games: one a “gamified” version of a species classification task, and one a fantasy game that used the classification task only as a way to advance in the game. Surprisingly, though we did observe cheating in the fantasy game, data quality (i.e., classification accuracy) from participants in the two games was not significantly different. Data from short-time contributors was also at a usable level of accuracy. Finally, learning did not seem to affect data quality in our context. These findings suggest that various approaches to gamification can be useful for motivating contributions to citizen science projects.
AB - Two key problems for crowd-sourcing systems are motivating contributions from participants and ensuring the quality of those contributions. Games have been suggested as a motivational approach to encourage contribution, but attracting participation through game play rather than intrinsic interest raises concerns about the quality of the contributions provided. These concerns are particularly important in the context of citizen science projects, where the contributions are data to be used for scientific research. To assess the validity of concerns about the effects of gaming on data quality, we compare the quality of data obtained from two citizen science games: one a “gamified” version of a species classification task, and one a fantasy game that used the classification task only as a way to advance in the game. Surprisingly, though we did observe cheating in the fantasy game, data quality (i.e., classification accuracy) from participants in the two games was not significantly different. Data from short-time contributors was also at a usable level of accuracy. Finally, learning did not seem to affect data quality in our context. These findings suggest that various approaches to gamification can be useful for motivating contributions to citizen science projects.
KW - Citizen science
KW - Crowdsourcing
KW - Data quality
KW - Games
KW - Gamification
UR - http://www.scopus.com/inward/record.url?scp=84998537728&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84998537728&partnerID=8YFLogxK
U2 - 10.1016/j.chb.2016.11.035
DO - 10.1016/j.chb.2016.11.035
M3 - Article
AN - SCOPUS:84998537728
SN - 0747-5632
VL - 68
SP - 254
EP - 268
JO - Computers in Human Behavior
JF - Computers in Human Behavior
ER -