Beyond the Medium: Rethinking Information Literacy through Crowdsourced Analysis

Olga Boichak, Jordan Canzonetta, Niraj Sitaula, Brian McKernan, Sarah Taylor, Patricia G.C. Rossini, B. A. Clegg, K. Kenski, Rosa Martey, Nancy McCracken, Carsten Oesterlund, Roc Myers, James E. Folkestad, Jennifer Stromer-Galley

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Information literacy encompasses a range of information evaluation skills for the purpose of making judgments. In the context of crowdsourcing, divergent evaluation criteria might introduce bias into collective judgments. Recent experiments have shown that crowd estimates can be swayed by social influence. This might be an unanticipated effect of media literacy training: encouraging readers to critically evaluate information falls short when their judgment criteria are unclear and vary among social groups. In this exploratory study, we investigate the criteria used by crowd workers in reasoning through a task. We crowdsourced the evaluation of a variety of information sources, identifying multiple factors that may affect individuals' judgments, as well as the accuracy of aggregated crowd estimates. Using a multi-method approach, we identify relationships between individual information assessment practices and analytical outcomes in crowds, and propose two analytic criteria, relevance and credibility, to optimize collective judgment in complex analytical tasks.
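The abstract refers to aggregated crowd estimates built from individual assessments along the criteria of relevance and credibility. As a purely illustrative sketch (not the authors' procedure; the 1-7 rating scale, the data layout, and the unweighted mean are assumptions), one simple way such per-worker assessments could be collapsed into a collective estimate is:

```python
# Hypothetical sketch: aggregating crowd workers' relevance and credibility
# ratings of information sources into a collective estimate. The 1-7 scale,
# the data layout, and the unweighted mean are illustrative assumptions,
# not the method reported in the paper.
from statistics import mean

# Each worker rates each source on relevance and credibility (1-7).
ratings = {
    "source_a": [
        {"relevance": 6, "credibility": 5},
        {"relevance": 7, "credibility": 6},
        {"relevance": 5, "credibility": 6},
    ],
    "source_b": [
        {"relevance": 3, "credibility": 2},
        {"relevance": 4, "credibility": 3},
        {"relevance": 2, "credibility": 3},
    ],
}

def aggregate(worker_ratings):
    """Collapse individual judgments into one crowd estimate per criterion."""
    return {
        "relevance": mean(r["relevance"] for r in worker_ratings),
        "credibility": mean(r["credibility"] for r in worker_ratings),
    }

for source, worker_ratings in ratings.items():
    print(source, aggregate(worker_ratings))
```

Weighting workers by past accuracy or modeling disagreement explicitly would be natural extensions; the snippet only illustrates the idea of an aggregated crowd estimate.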
Original language: English (US)
Title of host publication: Proceedings of the 52nd Hawaii International Conference on System Sciences
Number of pages: 10
DOIs: 10.24251/HICSS.2019.051
State: Published - 2019

Fingerprint

information medium, literacy, evaluation, source of information, credibility, worker, experiment, trend

Cite this

Boichak, O., Canzonetta, J., Sitaula, N., McKernan, B., Taylor, S., Rossini, P. G. C., ... Stromer-Galley, J. (2019). Beyond the Medium: Rethinking Information Literacy through Crowdsourced Analysis. In Proceedings of the 52nd Hawaii International Conference on System Sciences. https://doi.org/10.24251/HICSS.2019.051

Beyond the Medium: Rethinking Information Literacy through Crowdsourced Analysis. / Boichak, Olga; Canzonetta, Jordan; Sitaula, Niraj; McKernan, Brian; Taylor, Sarah; Rossini, Patricia G.C.; Clegg, B. A.; Kenski, K.; Martey, Rosa; McCracken, Nancy; Oesterlund, Carsten; Myers, Roc; Folkestad, James E.; Stromer-Galley, Jennifer.

Proceedings of the 52nd Hawaii International Conference on System Sciences. 2019.


Boichak, O, Canzonetta, J, Sitaula, N, McKernan, B, Taylor, S, Rossini, PGC, Clegg, BA, Kenski, K, Martey, R, McCracken, N, Oesterlund, C, Myers, R, Folkestad, JE & Stromer-Galley, J 2019, Beyond the Medium: Rethinking Information Literacy through Crowdsourced Analysis. in Proceedings of the 52nd Hawaii International Conference on System Sciences. https://doi.org/10.24251/HICSS.2019.051
Boichak O, Canzonetta J, Sitaula N, McKernan B, Taylor S, Rossini PGC et al. Beyond the Medium: Rethinking Information Literacy through Crowdsourced Analysis. In Proceedings of the 52nd Hawaii International Conference on System Sciences. 2019. https://doi.org/10.24251/HICSS.2019.051
Boichak, Olga ; Canzonetta, Jordan ; Sitaula, Niraj ; McKernan, Brian ; Taylor, Sarah ; Rossini, Patricia G.C. ; Clegg, B. A. ; Kenski, K. ; Martey, Rosa ; McCracken, Nancy ; Oesterlund, Carsten ; Myers, Roc ; Folkestad, James E. ; Stromer-Galley, Jennifer. / Beyond the Medium: Rethinking Information Literacy through Crowdsourced Analysis. Proceedings of the 52nd Hawaii International Conference on System Sciences. 2019.
@inproceedings{4d30b6eb82d04930865deb0a424a2e1b,
title = "Beyond the Medium: Rethinking Information Literacy through Crowdsourced Analysis",
abstract = "Information literacy encompasses a range of information evaluation skills for the purpose of making judgments. In the context of crowdsourcing, divergent evaluation criteria might introduce bias into collective judgments. Recent experiments have shown that crowd estimates can be swayed by social influence. This might be an unanticipated effect of media literacy training: encouraging readers to critically evaluate information falls short when their judgment criteria are unclear and vary among social groups. In this exploratory study, we investigate the criteria used by crowd workers in reasoning through a task. We crowdsourced the evaluation of a variety of information sources, identifying multiple factors that may affect individuals' judgments, as well as the accuracy of aggregated crowd estimates. Using a multi-method approach, we identify relationships between individual information assessment practices and analytical outcomes in crowds, and propose two analytic criteria, relevance and credibility, to optimize collective judgment in complex analytical tasks.",
author = "Olga Boichak and Jordan Canzonetta and Niraj Sitaula and Brian McKernan and Sarah Taylor and Rossini, {Patricia G.C.} and Clegg, {B. A.} and K. Kenski and Rosa Martey and Nancy McCracken and Carsten Oesterlund and Roc Myers and Folkestad, {James E.} and Jennifer Stromer-Galley",
year = "2019",
doi = "10.24251/HICSS.2019.051",
language = "English (US)",
booktitle = "Proceedings of the 52nd Hawaii International Conference on System Sciences",

}

TY - GEN

T1 - Beyond the Medium

T2 - Rethinking Information Literacy through Crowdsourced Analysis

AU - Boichak, Olga

AU - Canzonetta, Jordan

AU - Sitaula, Niraj

AU - McKernan, Brian

AU - Taylor, Sarah

AU - Rossini, Patricia G.C.

AU - Clegg, B. A.

AU - Kenski, K.

AU - Martey, Rosa

AU - McCracken, Nancy

AU - Oesterlund, Carsten

AU - Myers, Roc

AU - Folkestad, James E.

AU - Stromer-Galley, Jennifer

PY - 2019

Y1 - 2019

N2 - Information literacy encompasses a range of information evaluation skills for the purpose of making judgments. In the context of crowdsourcing, divergent evaluation criteria might introduce bias into collective judgments. Recent experiments have shown that crowd estimates can be swayed by social influence. This might be an unanticipated effect of media literacy training: encouraging readers to critically evaluate information falls short when their judgment criteria are unclear and vary among social groups. In this exploratory study, we investigate the criteria used by crowd workers in reasoning through a task. We crowdsourced the evaluation of a variety of information sources, identifying multiple factors that may affect individuals' judgments, as well as the accuracy of aggregated crowd estimates. Using a multi-method approach, we identify relationships between individual information assessment practices and analytical outcomes in crowds, and propose two analytic criteria, relevance and credibility, to optimize collective judgment in complex analytical tasks.

AB - Information literacy encompasses a range of information evaluation skills for the purpose of making judgments. In the context of crowdsourcing, divergent evaluation criteria might introduce bias into collective judgments. Recent experiments have shown that crowd estimates can be swayed by social influence. This might be an unanticipated effect of media literacy training: encouraging readers to critically evaluate information falls short when their judgment criteria are unclear and vary among social groups. In this exploratory study, we investigate the criteria used by crowd workers in reasoning through a task. We crowdsourced the evaluation of a variety of information sources, identifying multiple factors that may affect individuals' judgments, as well as the accuracy of aggregated crowd estimates. Using a multi-method approach, we identify relationships between individual information assessment practices and analytical outcomes in crowds, and propose two analytic criteria, relevance and credibility, to optimize collective judgment in complex analytical tasks.

U2 - 10.24251/HICSS.2019.051

DO - 10.24251/HICSS.2019.051

M3 - Conference contribution

BT - Proceedings of the 52nd Hawaii International Conference on System Sciences

ER -