Communicating Science in Times of Crisis
References
1 3M. (2020). State of science index: 2020 global report. https://multimedia.3m.com/mws/media/1898512O/3m-sosi-2020-pandemic-pulse-global-report-pdf.pdf
2 Alemanno, A. (2018). How to counter fake news? A taxonomy of anti-fake news approaches. European Journal of Risk Regulation, 9(1), 1–5. https://doi.org/10.1017/err.2018.12
3 Allcott, H., & Gentzkow, M. (2017). Social media and fake news in the 2016 election. Journal of Economic Perspectives, 31(2), 211–236. https://pubs.aeaweb.org/doi/pdfplus/10.1257/jep.31.2.211
4 Allcott, H., Gentzkow, M., & Yu, C. (2019). Trends in the diffusion of misinformation on social media. Research and Politics, 6(2), 1–8. https://doi.org/10.1177/2053168019848554
5 Allem, J.-P., & Ferrara, E. (2018). Could social bots pose a threat to public health? American Journal of Public Health, 108(8), 1005–1006. https://doi.org/10.2105/AJPH.2018.304512
6 Al-Rawi, A., Groshek, J., & Zhang, L. (2019). What the fake? Assessing the extent of networked political spamming and bots in the propagation of #fakenews on Twitter. Online Information Review, 43(1), 53–71. https://doi.org/10.1108/OIR-02-2018-0065
7 Alshaabi, T., Arnold, M. V., Minot, J. R., Adams, J. L., Dewhurst, D. R., Reagan, A. J., Muhamad, R., Danforth, C. M., & Dodds, P. S. (2021). How the world’s collective attention is being paid to a pandemic: COVID-19 related 1-gram time series for 24 languages on Twitter. PLoS ONE, 16(1), e0244476. https://doi.org/10.1371/journal.pone.0244476
8 Alzamora, G. C., & Andrade, L. (2019). The transmedia dynamics of fake news by the pragmatic conception of truth. MATRIZes, 13(1), 109–131. https://doi.org/10.11606/issn.1982-8160.v13i1p109-131
9 Andrade, G. (2020). Medical conspiracy theories: Cognitive science and implications for ethics. Medicine, Health Care and Philosophy, 23(3), 505–518. https://doi.org/10.1007/s11019-020-09951-6
10 Asprem, E., & Dyrendal, A. (2015). Conspirituality reconsidered: How surprising and how new is the confluence of spirituality and conspiracy theory? Journal of Contemporary Religion, 30(3), 367–382. https://doi.org/10.1080/13537903.2015.1081339
11 Atlani-Duault, L., Ward, J. K., Roy, M., Morin, C., & Wilson, A. (2020). Tracking online heroization and blame in epidemics. The Lancet Public Health, 5(3), e137–e138. https://doi.org/10.1016/S2468-2667(20)30033-5
12 Avramov, K., Gatov, V., & Yablokov, I. (2020). Conspiracy theories and fake news. In M. Butter & P. Knight (Eds.), Routledge handbook of conspiracy theories (pp. 512–524). Routledge.
13 Baesler, E. J. (1995). Construction and test of an empirical measure for narrative coherence and fidelity. Communication Reports, 8(2), 97–101. https://doi.org/10.1080/08934219509367615
14 Bangerter, A., Wagner-Egger, P., & Delouvée, S. (2020). How conspiracy theories spread. In M. Butter & P. Knight (Eds.), Routledge handbook of conspiracy theories (pp. 206–218). Routledge.
15 Baptista, J. P., & Gradim, A. (2020). Understanding fake news consumption: A review. Social Sciences, 9(10), 185. https://doi.org/10.3390/socsci9100185
16 Berduygina, O. N., Vladimirova, T. N., & Chernyaeva, E. V. (2019). Trends in the spread of fake news in mass media. Media Watch, 10(1), 122–132. https://doi.org/10.15655/mw/2019/v10i1/49561
17 Bessi, A., Zollo, F., Del Vicario, M., Scala, A., Caldarelli, G., & Quattrociocchi, W. (2015). Trend of narratives in the age of misinformation. PLoS ONE, 10(8), 1–16. https://doi.org/10.1371/journal.pone.0134641
18 Bjerg, O., & Presskorn-Thygesen, T. (2017). Conspiracy theory: Truth claim or language game? Theory, Culture & Society, 34(1), 137–159. https://doi.org/10.1177/0263276416657880
19 Black, P. J., Wollis, M., Woodworth, M., & Hancock, J. T. (2015, June). A linguistic analysis of grooming strategies of online child sex offenders: Implications for our understanding of predatory sexual behavior in an increasingly computer-mediated world. Child Abuse & Neglect, 44, 140–149. https://doi.org/10.1016/j.chiabu.2014.12.004
20 Bondielli, A., & Marcelloni, F. (2019). A survey on fake news and rumour detection techniques. Information Sciences, 497, 38–55. https://doi.org/10.1016/j.ins.2019.05.035
21 Bradshaw, S., & Howard, P. N. (2018). The global organization of social media disinformation campaigns. Journal of International Affairs, 71(1.5), 23–31. https://www.jstor.org/stable/26508115
22 Bradshaw, S., Howard, P. N., Kollanyi, B., & Neudert, L.-M. (2020). Sourcing and automation of political news and information over social media in the United States, 2016–2018. Political Communication, 37(2), 173–193. https://doi.org/10.1080/10584609.2019.1663322
23 Brennen, J. S., Simon, F. M., Howard, P. N., & Nielsen, R. K. (2020, April). Types, sources, and claims of COVID-19 misinformation. Reuters Institute for the Study of Journalism, University of Oxford. https://reutersinstitute.politics.ox.ac.uk/types-sources-and-claims-covid-19-misinformation
24 Britton, T., Ball, F., & Trapman, P. (2020). A mathematical model reveals the influence of population heterogeneity on herd immunity to SARS-CoV-2. Science, 369(6505), 846–849. https://doi.org/10.1126/science.abc6810
25 Brody, D. C., & Meier, D. M. (2018). How to model fake news. arXiv:1809.00964v2
26 Brotherton, R. (2013, September). Towards a definition of ‘conspiracy theory’. PSYPAG Quarterly, 88, 9–14. http://www.psypag.co.uk/wp-content/uploads/2013/09/Issue-88.pdf
27 Brotherton, R., & French, C. C. (2014). Belief in conspiracy theories and susceptibility to the conjunction fallacy. Applied Cognitive Psychology, 28(2), 238–248. https://doi.org/10.1002/acp.2995
28 Brotherton, R., & French, C. C. (2015). Intention seekers: Conspiracist ideation and biased attributions of intentionality. PloS One, 10(5), e0124125. https://doi.org/10.1371/journal.pone.0124125
29 Brotherton, R., French, C. C., & Pickering, A. D. (2013, May). Measuring belief in conspiracy theories: The generic conspiracist beliefs scale. Frontiers in Psychology, 4, 279. https://doi.org/10.3389/fpsyg.2013.00279
30 Bruder, M., Haffke, P., Neave, N., Nouripanah, N., & Imhoff, R. (2013, December). Measuring individual differences in generic beliefs in conspiracy theories across cultures: Conspiracy mentality questionnaire. Frontiers in Psychology, 4, 225. https://doi.org/10.3389/fpsyg.2013.00225
31 Bryant, E. (2008). Real lies, white lies and gray lies: Towards a typology of deception. Kaleidoscope, 7, 23–48. https://digitalcommons.trinity.edu/hct_faculty/6
32 Buller, D. B., & Burgoon, J. K. (1996). Interpersonal deception theory. Communication Theory, 6(3), 203–242. https://doi.org/10.1111/j.1468-2885.1996.tb00127.x
33 Burgess, A. W., & Hartman, C. R. (2018). On the origin of grooming. Journal of Interpersonal Violence, 33(1), 17–23. https://doi.org/10.1177/0886260517742048
34 Butter, M., & Knight, P. (2016). Bridging the great divide: Conspiracy theory research for the 21st century. Diogenes, 1–13. https://doi.org/10.1177/0392192116669289
35 Caballero, E. G. (2020). Social network analysis, social big data and conspiracy theories. In M. Butter & P. Knight (Eds.), Routledge handbook of conspiracy theories (pp. 135–147). Routledge.
36 Calisher, C., Carroll, D., Colwell, R., Corley, R. B., Daszak, P., Drosten, C., Enjuanes, L., Farrar, J., Field, H., Golding, J., Gorbalenya, A., Haagmans, B., Hughes, J. M., Karesh, W. B., Keusch, G. T., Lam, S. K., Lubroth, J., Mackenzie, J. S., Madoff, L., & Turner, M. (2020). Statement in support of the scientists, public health professionals, and medical professionals of China combatting COVID-19. Lancet, 395(10226), e42–e43. https://doi.org/10.1016/S0140-6736(20)30418-9
37 Cantarero, K., Van Tilburg, W. A. P., & Szarota, P. (2018, November). Differentiating everyday lies: A typology of lies based on beneficiary and motivation. Personality and Individual Differences, 134, 252–260. https://doi.org/10.1016/j.paid.2018.05.013
38 Carlson, M. (2020). Fake news as an informational moral panic: The symbolic deviancy of social media during the 2016 US presidential election. Information, Communication & Society, 23(3), 374–388. https://doi.org/10.1080/1369118X.2018.1505934
39 Carmichael, A. G. (1998). The last past plague: The uses of memory in Renaissance epidemics. Journal of the History of Medicine and Allied Sciences, 53(2), 132–160. https://doi.org/10.1093/jhmas/53.2.132
40 Centers for Disease Control and Prevention. (2019). Prioritizing zoonotic diseases for multisectoral, one health collaboration in the United States (Workshop summary). https://www.cdc.gov/onehealth/what-we-do/zoonotic-disease-prioritization/us-workshops.html
41 Chou, W.-Y. S., Oh, A., & Klein, W. M. P. (2018). Addressing health-related misinformation on social media. JAMA, 320(23), 2417–2418. https://doi.org/10.1001/jama.2018.16865
42 Clarke, S. (2002). Conspiracy theories and conspiracy theorizing. Philosophy of the Social Sciences, 32(2), 131–150. https://doi.org/10.1177/004931032002001
43 Clementson, D. E. (2017). Truth bias and partisan bias in political deception detection. Journal of Language and Social Psychology, 37(4), 407–430. https://doi.org/10.1177/0261927X17744004
44 Cohn, S. K., Jr. (2007). The Black Death and the burning of Jews. Past & Present, 196(1), 3–36. https://doi.org/10.1093/pastj/gtm005
45 Cohn, S. K. (2012). Pandemics: Waves of disease, waves of hate from the plague of Athens to A.I.D.S. Historical Research, 85(230), 535–555. https://doi.org/10.1111/j.1468-2281.2012.00603.x
46 Connolly, J. M., Uscinski, J. E., Klofstad, C. A., & West, J. P. (2019). Communicating to the public in the era of conspiracy theory. Public Integrity, 21(5), 469–476. https://doi.org/10.1080/10999922.2019.1603045
47 Corrigan, R., & Denton, P. (1996). Causal understanding as a developmental primitive. Developmental Review, 16(2), 162–202. https://doi.org/10.1006/drev.1996.0007
48 Curtis, D. A., & Hart, C. L. (2020). Deception in psychotherapy: Frequency, typology and relationship. Counselling & Psychotherapy Research, 20(1), 106–115. https://doi.org/10.1002/capr.12263
49 Davis, M. (2019). Uncertainty and immunity in public communications on pandemics. In K. Bjørkdahl & B. Carlsen (Eds.), Pandemics, publics, and politics (pp. 29–42). Palgrave Macmillan. https://doi.org/10.1007/978-981-13-2802-2_3
50 Dawes, G. W. (2018). Identifying pseudoscience: A social process criterion. Journal for General Philosophy of Science, 49(3), 283–298. https://doi.org/10.1007/s10838-017-9388-6
51 de Regt, A., Montecchi, M., & Lord Ferguson, S. (2020). A false image of health: How fake news and pseudo-facts spread in the health and beauty industry. Journal of Product & Brand Management, 29(2), 168–179. https://doi.org/10.1108/JPBM-12-2018-2180
52 de Santisteban, P., del Hoyo, J., Alcázar-Córcoles, M. Á., & Gámez-Guadix, M. (2018). Progression, maintenance, and feedback of online child sexual grooming: A qualitative analysis of online predators. Child Abuse & Neglect, 80, 203–215. https://doi.org/10.1016/j.chiabu.2018.03.026
53 Del Vicario, M., Bessi, A., Zollo, F., Petroni, F., Scala, A., Caldarelli, G., Stanley, H. E., & Quattrociocchi, W. (2016). The spreading of misinformation online. Proceedings of the National Academy of Sciences of the United States of America, 113(3), 554–559. https://doi.org/10.1073/pnas.1517441113
54 Dietz, P. (2018). Grooming and seduction. Journal of Interpersonal Violence, 33(1), 28–36. https://doi.org/10.1177/0886260517742060
55 DiMaggio, P. J. (1995). Comments on ‘What theory is not’. Administrative Science Quarterly, 40(3), 391–397. https://doi.org/10.2307/2393790
56 Douglas, K. M., & Sutton, R. M. (2011). Does it take one to know one? Endorsement of conspiracy theories is influenced by personal willingness to conspire. The British Journal of Social Psychology, 50(3), 544–552. https://doi.org/10.1111/j.2044-8309.2010.02018.x
57 Douglas, K. M., Sutton, R. M., Callan, M. J., Dawtry, R. J., & Harvey, A. J. (2016). Someone is pulling the strings: Hypersensitive agency detection and belief in conspiracy theories. Thinking & Reasoning, 22(1), 57–77. https://doi.org/10.1080/13546783.2015.1051586
58 Douglas, K. M., Uscinski, J. E., Sutton, R. M., Cichocka, A., Nefes, T., Ang, C. S., & Deravi, F. (2019). Understanding conspiracy theories. Advances in Political Psychology, 40(suppl. 1), 3–35. https://doi.org/10.1111/pops.12568
59 Drinkwater, K. G., Dagnall, N., Denovan, A., & Neave, N. (2020). Psychometric assessment of the generic conspiracist beliefs scale. PLoS ONE, 15(3), 1–19. https://doi.org/10.1371/journal.pone.0230365
60 Durodolu, O. O., & Ibenne, S. K. (2020, June). The fake news infodemic vs information literacy. Library Hi Tech News, 37(7), 13–14. https://doi.org/10.1108/LHTN-03-2020-0020
61 Edy, J. A., & Risley-Baird, E. E. (2016a). Misperceptions as political conflict: Using Schattschneider’s conflict theory to understand rumor dynamics. International Journal of Communication, 10, 2596–2615. https://ijoc.org/index.php/ijoc/article/view/4430/1668
62 Edy, J. A., & Risley-Baird, E. E. (2016b). Rumor communities: The social dimensions of internet political misperceptions. Social Science Quarterly, 97(3), 588–602. https://doi.org/10.1111/ssqu.12309
63 Effron, D. A., & Raj, M. (2020). Misinformation and morality: Encountering fake-news headlines makes them seem less unethical to publish and share. Psychological Science, 31(1), 75–87. https://doi.org/10.1177/0956797619887896
64 Ekman, P., & O’Sullivan, M. (2006). From flawed self-assessment to blatant whoppers: The utility of voluntary and involuntary behavior in detecting deception. Behavioral Sciences & the Law, 24(5), 673–686. https://doi.org/10.1002/bsl.729
65 Enders, A. M., & Smallpage, S. M. (2019). Informational cues, partisan-motivated reasoning, and the manipulation of conspiracy beliefs. Political Communication, 36(1), 83–102. https://doi.org/10.1080/10584609.2018.1493006
66 Erat, S., & Gneezy, U. (2012). White lies. Management Science, 58(4), 723–733. https://doi.org/10.1287/mnsc.1110.1449
67 Fasce, A., & Picó, A. (2019a). Conceptual foundations and validation of the Pseudoscientific Belief Scale. Applied Cognitive Psychology, 33(4), 617–628. https://doi.org/10.1002/acp.3501
68 Fasce, A., & Picó, A. (2019b). Science as a vaccine: The relation between scientific literacy and unwarranted beliefs. Science & Education, 28(1–2), 109–125. https://doi.org/10.1007/s11191-018-00022-0
69 Ferrara, E. (2020a). #COVID-19 on Twitter: Bots, conspiracies, and social media activism. arXiv preprint. https://dx.doi.org/10.5210/fm.v25i6.10633
70 Ferrara, E. (2020b). What types of COVID-19 conspiracies are populated by Twitter bots? First Monday, 25(6). https://doi.org/10.5210/fm.v25i6.10633
71 Ferrara, E., Varol, O., Davis, C., Menczer, F., & Flammini, A. (2016). The rise of social bots. Communications of the ACM, 59(7), 96–104. https://doi.org/10.1145/2818717
72 Ferreira, C. C., Robertson, J., & Kirsten, M. (2020). The truth (as I see it): Philosophical considerations influencing a typology of fake news. Journal of Product & Brand Management, 29(2), 150–158. https://doi.org/10.1108/JPBM-12-2018-2149
73 Feyerabend, P. (1980). How to defend society against science. In E. D. Klemke, R. Hollinger, & A. D. Kline (Eds.), Introductory readings in the philosophy of science (pp. 55–65). Prometheus.
74 Finley, T., & Koyama, M. (2018). Plague, politics, and pogroms: The Black Death, the rule of law, and the persecution of Jews in the Holy Roman Empire. Journal of Law & Economics, 61(2), 253–277. https://doi.org/10.1086/699016
75 Fisher, W. R. (1980). Rationality and the logic of good reasons. Philosophy & Rhetoric, 13(2), 121–130. https://www.jstor.org/stable/40237140
76 Fisher, W. R. (1985a). The narrative paradigm: An elaboration. Communication Monographs, 52(4), 347–367. https://doi.org/10.1080/03637758509376117
77 Fisher, W. R. (1985b). The narrative paradigm: In the beginning. Journal of Communication, 35(4), 74–89. https://doi.org/10.1111/j.1460-2466.1985.tb02974.x
78 Flack, J. C., & de Waal, F. (2007). Context modulates signal meaning in primate communication. Proceedings of the National Academy of Sciences of the United States of America, 104(5), 1581–1586. https://doi.org/10.1073/pnas.0603565104
79 Franks, B., Bangerter, A., & Bauer, M. W. (2013, July). Conspiracy theories as quasireligious mentality: An integrated account from cognitive science, social representations theory, and frame theory. Frontiers in Psychology, 4, 424. https://doi.org/10.3389/fpsyg.2013.00424
80 Franks, B., Bangerter, A., Bauer, M. W., Hall, M., & Noort, M. C. (2017, June). Beyond “monologicality”? Exploring conspiracist worldviews. Frontiers in Psychology, 8, 861. https://doi.org/10.3389/fpsyg.2017.00861
81 Freeman, D., Waite, F., Rosebrock, L., Petit, A., Causier, C., East, A., Jenner, L., Teale, A.-L., Carr, L., Mulhall, S., Bold, E., & Lambe, S. (2020a). Coronavirus conspiracy beliefs, mistrust, and compliance with government guidelines in England. Psychological Medicine. Online first. https://doi.org/10.1017/S0033291720001890
82 Freeman, D., Waite, F., Rosebrock, L., Petit, A., Causier, C., East, A., Jenner, L., Teale, A.-L., Carr, L., Mulhall, S., Bold, E., & Lambe, S. (2020b). We should beware of ignoring uncomfortable possible truths (a reply to McManus et al). Psychological Medicine. Online first. https://doi.org/10.1017/S0033291720002196
83 Gallotti, R., Sacco, P. L., & De Domenico, M. (2020). Assessing the risks of “infodemics” in response to COVID-19 epidemics. Nature Human Behaviour, 4, 1285–1293. https://doi.org/10.1038/s41562-020-00994-6
84 Gámez-Guadix, M., Almendros, C., Calvete, E., & De Santisteban, P. (2018, February). Persuasion strategies and sexual solicitations and interactions in online sexual grooming of adolescents: Modeling direct and indirect pathways. Journal of Adolescence, 63, 11–18. https://doi.org/10.1016/j.adolescence.2017.12.002
85 Garrett, B., Murphy, S., Jamal, S., MacPhee, M., Reardon, J., Cheung, W., Mallia, E., & Jackson, C. (2019). Internet health scams—Developing a taxonomy and risk‐of‐deception assessment tool. Health & Social Care in the Community, 27(1), 226–240. https://doi.org/10.1111/hsc.12643
86 Gebauer, F., Raab, M. H., & Carbon, C. (2016). Conspiracy formation is in the detail: On the interaction of conspiratorial predispositions and semantic cues. Applied Cognitive Psychology, 30(6), 917–924. https://doi.org/10.1002/acp.3279
87 Geschke, D., Lorenz, J., & Holtz, P. (2019). The triple‐filter bubble: Using agent‐based modelling to test a meta‐theoretical framework for the emergence of filter bubbles and echo chambers. British Journal of Social Psychology, 58(1), 129–149. https://doi.org/10.1111/bjso.12286
88 Giglietto, F., Iannelli, L., Valeriani, A., & Rossi, L. (2019). “Fake news” is the invention of a liar: How false information circulates within the hybrid news system. Current Sociology, 67(4), 625–642. https://doi.org/10.1177/0011392119837536
89 Goertzel, T. (1994). Belief in conspiracy theories. Political Psychology, 15(4), 733–744. https://doi.org/10.2307/3791630
90 Goreis, A., & Voracek, M. (2019, February). A systematic review and meta-analysis of psychological research on conspiracy beliefs: Field characteristics, measurement instruments, and associations with personality traits. Frontiers in Psychology, 10, 1–13. https://doi.org/10.3389/fpsyg.2019.00205
91 Graham, T., Bruns, A., Zhu, G., & Campbell, R. (2020, May). Like a virus: The coordinated spread of coronavirus disinformation. Australia Institute. https://apo.org.au/sites/default/files/resource-files/2020-06/apo-nid305864.pdf
92 Grimes, D. R. (2016). Correction: On the viability of conspiratorial beliefs. PLoS ONE, 11(3), e0151003. https://doi.org/10.1371/journal.pone.0151003
93 Guadagno, R. E., Rempala, D. M., Murphy, S., & Okdie, B. M. (2013). What makes a video go viral? An analysis of emotional contagion and internet memes. Computers in Human Behavior, 29(6), 2312–2319. https://doi.org/10.1016/j.chb.2013.04.016
94 Guarda, R. F., Ohlson, M. P., Romanini, A. V., & Martínez-Ávila, D. (2018). Disinformation, dystopia and post-reality in social media: A semiotic-cognitive perspective. Education for Information, 34(3), 185–197. https://doi.org/10.3233/EFI-180209
95 Hameleers, M., Powell, T. E., Van Der Meer, T. G. L. A., & Bos, L. (2020). A picture paints a thousand lies? The effects and mechanisms of multimodal disinformation and rebuttals disseminated via social media. Political Communication, 37(2), 281–301. https://doi.org/10.1080/10584609.2019.1674979
96 Hart, J., & Graether, M. (2018). Something’s going on here: Psychological predictors of belief in conspiracy theories. Journal of Individual Differences, 39(4), 229–237. https://doi.org/10.1027/1614-0001/a000268
97 Hastak, M., & Mazis, M. B. (2011). Deception by implication: A typology of truthful but misleading advertising and labeling claims. Journal of Public Policy & Marketing, 30(2), 157–167. https://doi.org/10.1509/jppm.30.2.157
98 Hawes, L. (1975). Pragmatics of analoguing: Theory and model construction in communication. Addison-Wesley.
99 Hill, S. A., Laythe, B., Dagnall, N., Drinkwater, K., O’Keeffe, C., Ventola, A., & Houran, J. (2019). “Meme-spirited”: II. Illustrating the VAPUS model for ghost narratives. Australian Journal of Parapsychology, 19(1), 5–43. https://search.informit.com.au/documentSummary;dn=416415346494840;res=IELHSS
100 Hofstadter, R. (1964). The paranoid style in American politics. Harper’s Magazine, 229(11), 77–86. https://harpers.org/archive/1964/11/the-paranoid-style-in-american-politics
101 Hopper, R., & Bell, R. A. (1984). Broadening the deception construct. Quarterly Journal of Speech, 70(3), 288–302. https://doi.org/10.1080/00335638409383698
102 Humprecht, E. (2019). Where “fake news” flourishes: A comparison across four Western democracies. Information, Communication & Society, 22(13), 1973–1988. https://doi.org/10.1080/1369118X.2018.1474241
103 Huneman, P., & Vorms, M. (2018). Is a unified account of conspiracy theories possible? Argumenta, 3(2), 247–270. https://doi.org/10.23811/54.arg2017.hun.vor
104 Imhoff, R., & Lamberty, P. (2018). How paranoid are conspiracy believers? Toward a more fine‐grained understanding of the connect and disconnect between paranoia and belief in conspiracy theories. European Journal of Social Psychology, 48(7), 909–926. https://doi.org/10.1002/ejsp.2494
105 Introne, J., Yildirim, I. G., Iandoli, L., DeCook, J., & Elzeeini, S. (2018). How people weave online information into pseudoknowledge. Social Media + Society, 4(3), 1–15. https://doi.org/10.1177/2056305118785639
106 Jang, S. M., Geng, T., Queenie Li, J.-Y., Xia, R., Huang, C.-T., Kim, H., & Tang, J. (2018, July). A computational approach for examining the roots and spreading patterns of fake news: Evolution tree analysis. Computers in Human Behavior, 84, 103–113. https://doi.org/10.1016/j.chb.2018.02.032
107 Jin, F., Wang, W., Zhao, L., Dougherty, E. R., Cao, Y., Lu, C. T., & Ramakrishnan, N. (2014). Misinformation propagation in the age of Twitter. IEEE Computer, 47 (12), 90–94. https://doi.org/10.1109/MC.2014.361
108 Jolley, D., Douglas, K. M., & Sutton, R. M. (2018). Blaming a few bad apples to save a threatened barrel: The system‐justifying function of conspiracy theories. Political Psychology, 39(2), 465–478. https://doi.org/10.1111/pops.12404
109 Kalyanam, J., Velupillai, S., Doan, S., Conway, M., & Lanckriet, G. (2015, August). Facts and fabrications about Ebola: A Twitter based study. KDD 2015. http://eceweb.ucsd.edu/~gert/papers/KDD_BigCHat_2015.pdf
110 Karlova, N. A., & Fisher, K. E. (2013). A social diffusion model of misinformation and disinformation for understanding human information behaviour. Information Research, 18(1), 1–12. http://InformationR.net/ir/18-1/paper573.html
111 Kavanagh, J., & Rich, M. D. (2018). Truth decay: An initial exploration of the diminishing role of facts and analysis in American public life. Rand Corporation.
112 Kearney, M. D., Selvan, P., Hauer, M. K., Leader, A. E., & Massey, P. M. (2019). Characterizing HPV vaccine sentiments and content on Instagram. Health Education & Behavior, 46(2), 37–48. https://doi.org/10.1177/1090198119859412
113 Keeley, B. L. (1999). Of conspiracy theories. Journal of Philosophy, 96(3), 109–126. https://dx.doi.org/10.2139/ssrn.1084585
114 Kim, A., & Dennis, A. R. (2019). Says who? The effects of presentation format and source rating on fake news in social media. MIS Quarterly, 43(3), 1025–1039. https://doi.org/10.25300/MISQ/2019/15188
115 Klein, C., Clutton, P., & Dunn, A. G. (2019). Pathways to conspiracy: The social and linguistic precursors of involvement in Reddit’s conspiracy theory forum. PloS One, 14(11), e0225098. https://doi.org/10.1371/journal.pone.0225098
116 Klein, C., Clutton, P., & Polito, V. (2018, February). Topic modeling reveals distinct interests within an online conspiracy forum. Frontiers in Psychology, 9, 189. https://doi.org/10.3389/fpsyg.2018.00189
117 Kobayashi, K., & Hsu, M. (2017). Neural mechanisms of updating under reducible and irreducible uncertainty. The Journal of Neuroscience, 37(29), 6972–6982. https://doi.org/10.1523/JNEUROSCI.0535-17.2017
118 Kopp, C., Korb, K. B., & Mills, B. I. (2018). Information-theoretic models of deception: Modelling cooperation and diffusion in populations exposed to “fake news.” PloS One, 13(11), e0207383. https://doi.org/10.1371/journal.pone.0207383
119 Koro-Ljungberg, M., Carlson, D. L., Montana, A., & Cheek, J. (2019). Productive forces of post-truth(s)? Qualitative Inquiry, 25(6), 583–590. https://doi.org/10.1177/1077800418806595
120 Lanning, K. (2018). The evolution of grooming: Concept and term. Journal of Interpersonal Violence, 33(1), 5–16. https://doi.org/10.1177/0886260517742046
121 Lantian, A., Muller, D., Nurra, C., & Douglas, K. M. (2017). ‘I know things they don’t know!’: The role of need for uniqueness in belief in conspiracy theories. Social Psychology, 48(3), 160–173. https://doi.org/10.1027/1864-9335/a000306
122 Lantian, A., Muller, D., Nurra, C., Klein, O., Berjot, S., & Pantazi, M. (2018). Stigmatized beliefs: Conspiracy theories, anticipated negative evaluation of the self, and fear of social exclusion. European Journal of Social Psychology, 48(7), 939–954. https://doi.org/10.1002/ejsp.2498
123 Lazer, D. M. J., Baum, M. A., Benkler, Y., Berinsky, A. J., Greenhill, K. M., Menczer, F., Metzger, M. J., Nyhan, B., Pennycook, G., Rothschild, D., Schudson, M., Sloman, S. A., Sunstein, C. R., Thorson, E. A., Watts, D. J., & Zittrain, J. L. (2018). The science of fake news: Addressing fake news requires a multidisciplinary effort. Science, 359(6380), 1094–1096. https://doi.org/10.1126/science.aao2998
124 Leal, H. (2020). Networked disinformation and the lifecycle of online conspiracy theories. In M. Butter & P. Knight (Eds.), Routledge handbook of conspiracy theories (pp. 497–511). Routledge.
125 Leone, M., Madison, M.-L., & Ventsel, A. (2020). Semiotic approaches to conspiracy theories. In M. Butter & P. Knight (Eds.), Routledge handbook of conspiracy theories (pp. 43–55). Routledge.
126 Levine, T. R., Ali, M. V., Dean, M., Abdulla, R. A., & Garcia-Ruano, K. (2016). Toward a pan-cultural typology of deception motives. Journal of Intercultural Communication Research, 45(1), 1–12. https://doi.org/10.1080/17475759.2015.1137079
127 Limperos, A. M., & Silberman, W. (2019). Agenda setting in the age of emergent online media and social networks: Exploring the dangers of a news agenda influenced by subversive and fake information. In E. Downs (Ed.), Dark side of media & technology (pp. 37–48). Peter Lang.
128 Lin, L., Savoia, E., Agboola, F., & Viswanath, K. (2014). What have we learned about communication inequalities during the H1N1 pandemic: A systematic review of the literature. BMC Public Health, 14(1), 484. https://doi.org/10.1186/1471-2458-14-484
129 Lobato, E., Mendoza, J., Sims, V., & Chin, M. (2014). Examining the relationship between conspiracy theories, paranormal beliefs, and pseudoscience acceptance among a university population. Applied Cognitive Psychology, 28(5), 617–625. https://doi.org/10.1002/acp.3042
130 Lukito, J., Suk, J., Zhang, Y., Doroshenko, L., Kim, S. J., Su, M.-H., Xia, Y., Freelon, D., & Wells, C. (2020). The wolves in sheep’s clothing: How Russia’s internet research agency tweets appeared in U.S. news as vox populi. International Journal of Press/Politics, 25(2), 196–216. https://doi.org/10.1177/1940161219895215
131 Madisson, M.-L. (2014). The semiotic logic of signification of conspiracy theories. Semiotica, 2014(202), 273–300. https://doi.org/10.1515/sem-2014-0059
132 Magarey, R. D., & Trexler, C. M. (2020). Information: A missing component in understanding and mitigating social epidemics. Humanities & Social Sciences Communications, 7(128), 1–12. https://doi.org/10.1057/s41599-020-00620-w
133 Mahmoud, H. (2020). A model for the spreading of fake news. Journal of Applied Probability, 57(1), 332–342. https://doi.org/10.1017/jpr.2019.103
134 Malena-Chan, R. (2019). A narrative model for exploring climate change engagement among young community leaders. Health Promotion and Chronic Disease Prevention in Canada: Research, Policy and Practice, 39(4), 157–166. https://doi.org/10.24095/hpcdp.39.4.07
135 McKenzie-McHarg, A. (2020). Conceptual history and conspiracy theory. In M. Butter & P. Knight (Eds.), Routledge handbook of conspiracy theories (pp. 16–27). Routledge.
136 McManus, S., D’Ardenne, J., & Wessely, S. (2020). Covid conspiracies: Misleading evidence can be more damaging than no evidence at all. Psychological Medicine, 1–2. Online first. https://doi.org/10.1017/S0033291720002184
137 Mercier, H., Majima, Y., & Miton, H. (2018). Willingness to transmit and the spread of pseudoscientific beliefs. Applied Cognitive Psychology, 32(4), 499–505. https://doi.org/10.1002/acp.3413
138 Mitchell, A., & Oliphant, J. B. (2020, March 18). Americans immersed in COVID-19 news. Pew Research Center.
139 Moulding, R., Nix-Carnell, S., Schnabel, A., Nedeljkovic, M., Burnside, E. E., Lentini, A. F., & Mehzabin, N. (2016, August). Better the devil you know than a world you don’t? Intolerance of uncertainty and worldview explanations for belief in conspiracy theories. Personality and Individual Differences, 98, 345–354. https://doi.org/10.1016/j.paid.2016.04.060
140 Nawrat, A. (2020, February 26). Covid-19 outbreak: How misinformation could fuel global panic. Pharmaceutical Technology. https://www.pharmaceutical-technology.com/features/covid-19-outbreak-how-misinformation-could-spark-global-panic
141 Nguyen, H. (2020, March 2). Americans are taking action in response to coronavirus. YouGov. https://today.yougov.com/topics/health/articles-reports/2020/03/02/americans-are-taking-action-response-coronavirus
142 Nimmo, B., François, C., Eib, C. S., Ronzaud, L., Ferreira, R., Hernon, C., & Kostelancik, T. (2020, June 17). Secondary Infektion. Graphika. https://mediawell.ssrc.org/2020/06/18/secondary-infektion-at-a-glance
143 O’Hair, H. D., & Cody, M. J. (1994). Deception. In W. R. Cupach & B. H. Spitzberg (Eds.), The dark side of interpersonal communication (pp. 181–213). Lawrence Erlbaum Associates.
144 O’Sullivan, P. B., & Carr, C. T. (2018). Masspersonal communication: A model bridging the mass-interpersonal divide. New Media and Society, 20(3), 1161–1180. https://doi.org/10.1177/1461444816686104
145 Oliver, J. E., & Wood, T. (2014). Conspiracy theories and mass opinion. American Journal of Political Science, 58(4), 952–966. https://doi.org/10.1111/ajps.12084
146 Orso, D., Federici, N., Copetti, R., Vetrugno, L., & Bove, T. (2020). Infodemic and the spread of fake news in the COVID-19-era. European Journal of Emergency Medicine, 27(5), 327–328. https://doi.org/10.1097/MEJ.0000000000000713
147 Oxford University Press. (n.d.). Word of the year. https://languages.oup.com/word-of-the-year
148 Oyeyemi, S. O., Gabarron, E., & Wynn, R. (2014, October). Ebola, Twitter, and misinformation: A dangerous combination? British Medical Journal, 349, g6178. https://doi.org/10.1136/bmj.g6178
149 Patev, A. J., Hood, K. B., Speed, K. J., Cartwright, P. M., & Kinman, B. A. (2019). HIV conspiracy theory beliefs mediates the connection between HIV testing attitudes and HIV prevention self-efficacy. Journal of American College Health, 67(7), 661–673. https://doi.org/10.1080/07448481.2018.1500472
150 Pawlick, J., Colbert, E., & Zhu, Q. (2019). A game-theoretic taxonomy and survey of defensive deception for cybersecurity and privacy. ACM Computing Surveys, 52(4), 1–28. https://doi.org/10.1145/3337772
151 Pearson, G. (2020). Sources on social media: Information context collapse and volume of content as predictors of source blindness. New Media & Society. Advance online publication. https://doi.org/10.1177/1461444820910505
152 Pennycook, G., Cannon, T. D., & Rand, D. G. (2018). Prior exposure increases perceived accuracy of fake news. Journal of Experimental Psychology: General, 147(12), 1865–1880. https://doi.org/10.1037/xge0000465.supp
153 Pentland, B. T. (1999). Building process theory with narrative: From description to explanation. Academy of Management Review, 24(4), 711–724. https://doi.org/10.5465/AMR.1999.2553249
154 Peterson, H. L. (2018). Political information has bright colors: Narrative attention theory. Policy Studies Journal, 46(4), 828–842. https://doi.org/10.1111/psj.12272
155 Pigden, C. (1995). Popper revisited, or what is wrong with conspiracy theories? Philosophy of the Social Sciences, 25(1), 3. https://doi.org/10.1177/004839319502500101
156 Pomerantsev, P., & Weiss, M. (2014). The menace of unreality: How the Kremlin weaponizes information, culture and money. Institute of Modern Russia.
157 Popper, K. (1980). Science: Conjectures and refutations. In E. D. Klemke, R. Hollinger, & A. D. Kline (Eds.), Introductory readings in the philosophy of science (pp. 19–34). Prometheus.
158 Porter, C. M. (2014). The Black Death and persecution of the Jews. Saber and Scroll, 3(1), 55–65. https://saberandscroll.weebly.com/uploads/1/1/7/9/11798495/3.1._a4.pdf
159 Porter, S., Bellhouse, S., McDougall, A., ten Brinke, L., & Wilson, K. (2010). A prospective investigation of the vulnerability of memory for positive and negative emotional scenes to the misinformation effect. Canadian Journal of Behavioural Science, 42(1), 55–61. https://doi.org/10.1037/a0016652
160 Quinn, S. C., Hilyard, K. M., Jamison, A. M., An, J., Hancock, G. R., Musa, D., & Freimuth, V. S. (2017). The influence of social norms on flu vaccination among African American and White adults. Health Education Research, 32(6), 473–486. https://doi.org/10.1093/her/cyx070
161 Raab, M. H., Auer, N., Ortlieb, S. A., & Carbon, C.-C. (2013, July). The Sarrazin effect: The presence of absurd statements in conspiracy theories makes canonical information less plausible. Frontiers in Psychology, 4, 453. https://doi.org/10.3389/fpsyg.2013.00453
162 Raderstorf, B., & Camilleri, M. J. (2019, June). Online disinformation in the United States: Implications for Latin America. Peter D. Bell Rule of Law Program, Inter-American Dialogue.
163 Rampersad, G., & Althiyabi, T. (2020). Fake news: Acceptance by demographics and culture on social media. Journal of Information Technology & Politics, 17(1), 1–11. https://doi.org/10.1080/19331681.2019.1686676
164 Raspe, L. (2004). The Black Death in Jewish sources. Jewish Quarterly Review, 94(3), 471–489. https://doi.org/10.1353/jqr.2004.0001
165 Reyna, V. F. (2020). A scientific theory of gist communication and misinformation resistance, with implications for health, education, and policy. Proceedings of the National Academy of Sciences of the United States of America. https://doi.org/10.1073/pnas.1912441117
166 Romer, D., & Jamieson, K. H. (2020). Conspiracy theories as barriers to controlling the spread of COVID-19 in the U.S. Social Science & Medicine, 263, 113356. https://doi.org/10.1016/j.socscimed.2020.113356
167 Rommer, D., Majerova, J., & Machova, V. (2020). Repeated COVID-19 pandemic-related media consumption: Minimizing sharing of nonsensical misinformation through health literacy and critical thinking. Linguistic and Philosophical Investigations, 19, 107–113. https://doi.org/10.22381/LPI1920207
168 Rozenblit, L., & Keil, F. (2002). The misunderstood limits of folk science: An illusion of explanatory depth. Cognitive Science, 26(5), 521–562. https://doi.org/10.1207/s15516709cog2605_1
169 Rubin, V. L. (2019). Disinformation and misinformation triangle: A conceptual model for “fake news” epidemic, causal factors and interventions. Journal of Documentation, 75(5), 1013–1034. https://doi.org/10.1108/JD-12-2018-0209
170 Ryan, C. D., Schaul, A. J., Butner, R., & Swarthout, J. T. (2020). Monetizing disinformation in the attention economy: The case of genetically modified organisms (GMOs). European Management Journal, 38(1), 7–18. https://doi.org/10.1016/j.emj.2019.11.002
171 Salminen, J., Sengün, S., Corporan, J., Jung, S.-G., & Jansen, B. J. (2020). Topic-driven toxicity: Exploring the relationship between online toxicity and news topics. PLoS ONE, 15(2), e0228723. https://doi.org/10.1371/journal.pone.0228723
172 Savelli, M. R. (2016). The virtual marketplace of misinformation. Journal of Technologies in Knowledge Sharing, 12(3/4), 20–27.
173 Schaeffer, K. (2020, April). Nearly three-in-ten Americans believe COVID-19 was made in a lab. Pew Research Center. https://www.pewresearch.org/fact-tank/2020/04/08/nearly-three-in-ten-americans-believe-covid-19-was-made-in-a-lab
174 Scheufele, D. A., & Krause, N. M. (2019). Science audiences, misinformation, and fake news. Proceedings of the National Academy of Sciences of the United States of America, 116(16), 7662–7669. https://doi.org/10.1073/pnas.1805871115
175 Sell, T. K., Hosangadi, D., & Trotochaud, M. (2020). Misinformation and the US Ebola communication crisis: Analyzing the veracity and content of social media messages related to a fear-inducing infectious disease outbreak. BMC Public Health, 20(1), 1–10. https://doi.org/10.1186/s12889-020-08697-3
176 Seymour, B., Getman, R., Saraf, A., Zhang, L. H., & Kalenderian, E. (2015). When advocacy obscures accuracy online: Digital pandemics of public health misinformation through an antifluoride case study. American Journal of Public Health, 105(3), 517–523. https://doi.org/10.2105/AJPH.2014.302437
177 Shaffer, V. A., Focella, E. S., Hathaway, A., Scherer, L. D., & Zikmund-Fisher, B. J. (2018). On the usefulness of narratives: An interdisciplinary review and theoretical model. Annals of Behavioral Medicine, 52(5), 429–442. https://doi.org/10.1093/abm/kax008
178 Shaffer, V. A., & Zikmund-Fisher, B. J. (2013). All stories are not alike: A purpose-, content-, and valence-based taxonomy of patient narratives in decision aids. Medical Decision Making, 33(1), 4–13. https://doi.org/10.1177/0272989X12463266
179 Shahsavari, S., Holur, P., Tangherlini, T. R., & Roychowdhury, V. (2020). Conspiracy in the time of Corona: Automatic detection of COVID-19 conspiracy theories in the media and the news. arXiv:2004.13783. https://arxiv.org/pdf/2004.13783.pdf
180 Shao, C., Ciampaglia, G. L., Varol, O., Yang, K.-C., Flammini, A., & Menczer, F. (2018). The spread of low-credibility content by social bots. Nature Communications, 9(1), 4787. https://doi.org/10.1038/s41467-018-06930-7
181 Sharma, K., Qian, F., Jiang, H., Ruchansky, N., Zhang, M., & Liu, Y. (2019). Combating fake news: A survey on identification and mitigation techniques. ACM Transactions on Intelligent Systems & Technology, 10(3), 1–42. https://doi.org/10.1145/3305260
182 Sheares, G., Miklencicova, R., & Grupac, M. (2020). The viral power of fake news: Subjective social insecurity, COVID-19 damaging misinformation, and baseless conspiracy theories. Linguistic and Philosophical Investigations, 19, 121–127. https://doi.org/10.22381/LPI1920209
183 Shepherd, D. A., & Suddaby, R. (2017). Theory building: A review and integration. Journal of Management, 43(1), 59–86. https://doi.org/10.1177/0149206316647102
184 Shin, J., Jian, L., Driscoll, K., & Bar, F. (2018, June). The diffusion of misinformation on social media: Temporal pattern, message, and source. Computers in Human Behavior, 83, 278–287. https://doi.org/10.1016/j.chb.2018.02.008
185 Simon, H. A. (1971). Designing organizations for an information-rich world. In M. Greenberger (Ed.), Computers, communications, and the public interest (pp. 37–72). Johns Hopkins Press.
186 Sommariva, S., Vamos, C., Mantzarlis, A., Đào, L. U.-L., & Tyson, D. M. (2018). Spreading the (fake) news: Exploring health messages on social media and the implications for health professionals using a case study. American Journal of Health Education, 49(4), 246–255. https://doi.org/10.1080/19325037.2018.1473178
187 Southwell, B. G., Niederdeppe, J., Cappella, J. N., Gaysynsky, A., Kelley, D. E., Oh, A., Peterson, E. B., & Chou, W.-Y. S. (2019). Misinformation as a misunderstood challenge to public health. American Journal of Preventive Medicine, 57(2), 282–285. https://doi.org/10.1016/j.amepre.2019.03.009
188 Spitzberg, B. H. (2001). The status of attribution theory qua theory in personal relationships. In V. Manusov, & J. H. Harvey (Eds.), Attribution, communication behavior, and close relationships (pp. 353–371). Cambridge University Press.
189 Spitzberg, B. H. (2014). Toward a model of meme diffusion (M3D). Communication Theory, 24(3), 311–339. https://doi.org/10.1111/comt.12042
190 Spitzberg, B. H. (2019). Traces of pace, place and space in personal relationships: The chronogeometrics of studying relationships at scale. Personal Relationships, 26(2), 184–208. https://doi.org/10.1111/pere.12280
191 Spitzberg, B. H. (in press). Theorizing big social media data: A multilevel model of meme diffusion 2.0 (M3D2.0). In M.-H. Tsou & A. Nara (Eds.), Empowering human dynamics research with social media and geospatial data analytics. Springer.
192 Stano, S. (2020). The internet and the spread of conspiracy content. In M. Butter & P. Knight (Eds.), Routledge handbook of conspiracy theories (pp. 483–496). Routledge.
193 Stone, M., Aravopoulou, E., Evans, G., Aldhaen, E., & Parnell, B. D. (2019). From information mismanagement to misinformation—The dark side of information management. Bottom Line: Managing Library Finances, 32(1), 47–70. https://doi.org/10.1108/BL-09-2018-0043
194 Sunstein, C. R., & Vermeule, A. (2009). Conspiracy theories: Causes and cures. Journal of Political Philosophy, 17(2), 202–227. https://doi.org/10.1111/j.1467-9760.2008.00325.x
195 Swami, V., Barron, D., Weis, L., Voracek, M., Stieger, S., & Furnham, A. (2017). An examination of the factorial and convergent validity of four measures of conspiracist ideation, with recommendations for researchers. PLoS ONE, 12(2), 1–27. https://doi.org/10.1371/journal.pone.0172617
196 Swami, V., Furnham, A., Smyth, N., Weis, L., Lay, A., & Clow, A. (2016, September). Putting the stress on conspiracy theories: Examining associations between psychological stress, anxiety, and belief in conspiracy theories. Personality and Individual Differences, 99, 72–76. https://doi.org/10.1016/j.paid.2016.04.084
197 Talwar, S., Dhir, A., Kaur, P., Zafar, N., & Alrasheedy, M. (2019, November). Why do people share fake news? Associations between the dark side of social media use and fake news sharing behavior. Journal of Retailing & Consumer Services, 51, 72–82. https://doi.org/10.1016/j.jretconser.2019.05.026
198 Tandoc, E. C., Jr., Lim, Z. W., & Ling, R. (2017). Defining “fake news”: A typology of scholarly definitions. Digital Journalism, 6(2), 1–17. https://doi.org/10.1080/21670811.2017.1360143
199 Törnberg, P. (2018). Echo chambers and viral misinformation: Modeling fake news as complex contagion. PLoS ONE, 13(9), 1–21. https://doi.org/10.1371/journal.pone.0203958
200 Torres, R., Gerhart, N., & Negahban, A. (2018, January). Combating fake news: An investigation of information verification behaviors on social networking sites. Proceedings of the 51st Hawaii International Conference on System Sciences, 3976–3985. http://hdl.handle.net/10125/50387
201 Tucker, J. A., Guess, A., Barberá, P., Vaccari, C., Siegel, A., Sanovich, S., Stukal, D., & Nyhan, B. (2018, March). Social media, political polarization, and political disinformation: A review of the scientific literature. William + Flora Hewlett Foundation. https://www.hewlett.org/wp-content/uploads/2018/03/Social-Media-Political-Polarization-and-Political-Disinformation-Literature-Review.pdf
202 Twenge, J. M., & Spitzberg, B. H. (2020). Declines in non-digital social interaction among Americans, 2003–2017. Journal of Applied Social Psychology, 6(1), 329–345. https://doi.org/10.1111/jasp.12665
203 Twenge, J. M., Spitzberg, B. H., & Campbell, W. K. (2019). Less in-person social interaction with peers among U.S. adolescents in the 21st century and links to loneliness. Journal of Social and Personal Relationships, 36(6), 1892–1913. https://doi.org/10.1177/0265407519836170
204 Twenge, J. M., Martin, G., & Spitzberg, B. H. (2019). Trends in U.S. adolescents’ media use, 1976–2015: The rise of the Internet, the decline of TV, and the (near) demise of print. Psychology of Popular Media Culture, 8(4), 329–345. https://doi.org/10.1037/ppm0000203
205 UNESCO. (2018). Journalism, ‘fake news’ & disinformation. United Nations Educational, Scientific and Cultural Organization. https://en.unesco.org/fightfakenews
206 Uscinski, J. E. (2018). The study of conspiracy theories. Argumenta, 3(2), 233–245. https://doi.org/10.23811/53.arg2017.usc
207 Uscinski, J. E., Enders, A. M., Klofstad, C., Seelig, M., Funchion, J., Everett, C., Wuchty, S., Premaratne, K., & Murthi, M. (2020, April 28). Why do people believe COVID-19 conspiracy theories? Harvard Kennedy School Misinformation Review, 1. https://doi.org/10.37016/mr-2020-015
208 Van Heekeren, M. (2020). The curative effect of social media on fake news: A historical re-evaluation. Journalism Studies, 21(3), 306–318. https://doi.org/10.1080/1461670X.2019.1642136
209 van Prooijen, J.-W. (2016). Sometimes inclusion breeds suspicion: Self‐uncertainty and belongingness predict belief in conspiracy theories. European Journal of Social Psychology, 46(3), 267–279. https://doi.org/10.1002/ejsp.2157
210 van Prooijen, J.-W. (2017). Why education predicts decreased belief in conspiracy theories. Applied Cognitive Psychology, 31(1), 50–58. https://doi.org/10.1002/acp.3301
211 van Prooijen, J.-W., & Acker, M. (2015). The influence of control on belief in conspiracy theories: Conceptual and applied extensions. Applied Cognitive Psychology, 29(5), 753–761. https://doi.org/10.1002/acp.3161
212 van Prooijen, J.-W., & Douglas, K. M. (2018). Belief in conspiracy theories: Basic principles of an emerging research domain. European Journal of Social Psychology, 48(7), 897–908. https://doi.org/10.1002/ejsp.2530
213 Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146–1151. https://doi.org/10.1126/science.aap9559
214 Vraga, E. K., & Bode, L. (2020). Defining misinformation and understanding its bounded nature: Using expertise and evidence for describing misinformation. Political Communication, 37(1), 136–144. https://doi.org/10.1080/10584609.2020.1716500
215 Walczyk, J. J., Harris, L. L., Duck, T. K., & Mulay, D. (2014, August). A social-cognitive framework for understanding serious lies: Activation-decision-construction-action theory. New Ideas in Psychology, 34, 22–36. https://doi.org/10.1016/j.newideapsych.2014.03.001
216 Waldrop, M. M. (2017). The genuine problem of fake news: Intentionally deceptive news has co-opted social media to go viral and influence millions. Science and technology can suggest why and how. But can they offer solutions? Proceedings of the National Academy of Sciences of the United States of America, 114(48), 12631–12634. https://doi.org/10.1073/pnas.1719005114
217 Wang, R., He, Y., Xu, J., & Zhang, H. (2020). Fake news or bad news? Toward an emotion-driven cognitive dissonance model of misinformation diffusion. Asian Journal of Communication, 30(5), 317–342. https://doi.org/10.1080/01292986.2020.1811737
218 Ward, C., & Voas, D. (2011). The emergence of conspirituality. Journal of Contemporary Religion, 26(1), 103–121. https://doi.org/10.1080/13537903.2011.539846
219 Wardle, C., & Derakhshan, H. (2018). Journalism, ‘fake news’ and disinformation (Handbook for journalism education and training). UNESCO. https://bit.ly/2MuELY5
220 Weiss, A. P., Alwan, A., Garcia, E. P., & Garcia, J. (2020). Surveying fake news: Assessing university faculty’s fragmented definition of fake news and its impact on teaching critical thinking. International Journal of Educational Integrity, 16(1), 1–30. https://doi.org/10.1007/s40979-019-0049-x
221 Wood, M. J. (2017). Conspiracy suspicions as a proxy for beliefs in conspiracy theories: Implications for theory and measurement. British Journal of Psychology, 108(3), 507–527. https://doi.org/10.1111/bjop.12231
222 Wood, M. J., & Douglas, K. M. (2015, June). Online communication as a window to conspiracist worldviews. Frontiers in Psychology, 6, 836. https://doi.org/10.3389/fpsyg.2015.00836
223 Xia, Y., Lukito, J., Zhang, Y., Wells, C., Kim, S. J., & Tong, C. (2019). Disinformation, performed: Self-presentation of a Russian IRA account on Twitter. Information, Communication & Society, 22(11), 1646–1664. https://doi.org/10.1080/1369118X.2019.1621921
224 Xian, J., Yang, D., Pan, L., Wang, W., & Wang, Z. (2019). Misinformation spreading on correlated multiplex networks. Chaos, 29(11), 113123. https://doi.org/10.1063/1.5121394
225 Xiao, B., & Benbasat, I. (2011). Product-related deception in e-commerce: A theoretical perspective. MIS Quarterly, 35(1), 169–196. https://doi.org/10.2307/23043494
226 Zannettou, S., Caulfield, T., Blackburn, J., De Cristofaro, E., Sirivianos, M., Stringhini, G., & Suarez-Tangil, G. (2018). On the origins of memes by means of fringe web communities. Proceedings of the Internet Measurement Conference 2018, 188–202. https://doi.org/10.1145/3278532.3278550
227 Zimmer, F., Scheibe, K., Stock, M., & Stock, W. G. (2019). Fake news in social media: Bad algorithms or biased users? Journal of Information Science Theory & Practice (JISTaP), 7(2), 40–53. https://doi.org/10.1633/JISTaP.2019.7.2.4
228 Zollo, F. (2019). Dealing with digital misinformation: A polarised context of narratives and tribes. EFSA Journal, 17(S1), 1–15. https://doi.org/10.2903/j.efsa.2019.e170720
229 Zollo, F., Novak, P. K., Del Vicario, M., Bessi, A., Mozetič, I., Scala, A., Caldarelli, G., & Quattrociocchi, W. (2015). Emotional dynamics in the age of misinformation. PLoS ONE, 10(10), 1–22. https://doi.org/10.1371/journal.pone.0138740