TY - GEN
T1 - Beyond the Medium
T2 - Rethinking Information Literacy through Crowdsourced Analysis
AU - Boichak, Olga
AU - Canzonetta, Jordan
AU - Sitaula, Niraj
AU - McKernan, Brian
AU - Taylor, Sarah
AU - Rossini, Patricia G.C.
AU - Clegg, Benjamin A.
AU - Kenski, Kate
AU - Martey, Rosa
AU - McCracken, Nancy
AU - Oesterlund, Carsten
AU - Myers, Roc
AU - Folkestad, James E.
AU - Stromer-Galley, Jennifer
PY - 2019
Y1 - 2019
AB - Information literacy encompasses a range of information evaluation skills for the purpose of making judgments. In the context of crowdsourcing, divergent evaluation criteria might introduce bias into collective judgments. Recent experiments have shown that crowd estimates can be swayed by social influence. This might be an unanticipated effect of media literacy training: encouraging readers to critically evaluate information falls short when their judgment criteria are unclear and vary among social groups. In this exploratory study, we investigate the criteria used by crowd workers in reasoning through a task. We crowdsourced the evaluation of a variety of information sources, identifying multiple factors that may affect individuals' judgments, as well as the accuracy of aggregated crowd estimates. Using a multi-method approach, we identify relationships between individual information assessment practices and analytical outcomes in crowds, and propose two analytic criteria, relevance and credibility, to optimize collective judgment in complex analytical tasks.
U2 - 10.24251/HICSS.2019.051
DO - 10.24251/HICSS.2019.051
M3 - Conference contribution
BT - Proceedings of the 52nd Hawaii International Conference on System Sciences
ER -