
Building Resilience to Misinformation: An Instructional Toolkit

A toolkit to assist teaching faculty in engaging students on the topics of misinformation and disinformation.

Glossary

Actors: A person or thing that performs or takes part in an action; a doer, an agent (OED, n.d.).

Astroturf: Astroturfing is a subtype of disinformation, defined as a communicative strategy that uses websites, sock puppets, or bots on social media platforms to create the false impression that a particular opinion has widespread public support, when in fact this may not be the case (Zerback et al., 2021, pp. 1080-1081).

Attitudes: A feeling or emotion felt towards something or someone.

Audience: Any individuals or groups “who are actually reached by particular media content or media ‘channels’” (McQuail & Deuze, 2020, p. 587); the recipient(s) of media messages, information, or communication(s).

Authority: A power to influence or command thought, opinion, or behaviour; a convincing force; a person or persons in command (Merriam-Webster, n.d.).

Behaviours: An action taken or habit displayed by an individual; the way a person conducts themselves in response to their surrounding environment (Merriam-Webster, n.d.).

Beliefs: Something that is accepted, considered true, or held as an opinion (Merriam-Webster, n.d.). A fact, state, or phenomenon that an individual considers to be true. 

Bias: A disproportionate weight in favour of or against an idea or thing, usually in a way that is closed-minded, prejudicial, or unfair. Biases can be innate or learned, and people may develop biases for or against an individual, a group, or a belief.

Bots: Bots are social media accounts that are operated entirely by computer programs and are designed to generate posts and/or engage with content on a particular platform. In disinformation campaigns, bots can be used to draw attention to misleading narratives, to hijack platforms’ trending lists, and to create the illusion of public discussion and support (Howard & Kollanyi, 2016).

Cherry Picking: Selecting a set of "data that appear[s] to confirm one position while ignoring other data that contradicts that position" (Cook, 2020).

Cognitive Dissonance: Avoiding or misperceiving any incoming information or messages that challenge settled or pre-existing opinions and beliefs, thereby limiting the likelihood of an individual changing their pre-existing opinions, beliefs, or views (McQuail & Deuze, 2020).

Communication: A process of sharing between individuals and/or groups based on the idea of sending and receiving messages (McQuail & Deuze, 2020).

Confirmation Bias: Seeking and/or interpreting evidence in ways that are partial to existing beliefs, expectations, or a hypothesis (Nickerson, 1998, p. 175); accepting something that agrees with your worldview (regardless of whether it is true or false) while rejecting evidence that contradicts that worldview (even if it is true) (Agarwal & Alsaeedi, 2021, p. 643); a tendency and/or inclination to process information in a way that confirms our pre-existing beliefs (Reed et al., 2019, p. 217).

Conspiracy Theory: Typically refers to “claims of conspiracy which are less plausible than alternative explanations, contradict the general consensus among epistemic authorities, are predicated on weak evidence, postulate unusually sinister and competent conspirators, and are ultimately unfalsifiable” (Brotherton & Eser, 2015, p. 1; see also Brotherton, 2013). People who believe one conspiracy theory are more likely to believe others (Beene & Greer, 2021, p. 9).

Context: The situation within which something exists or occurs; an environmental setting (Merriam-Webster, n.d.).

Credibility: The quality of being believable and worthy of trust.

Credibility Importance: Measures the importance an individual ascribes to trusting credible sources and scientific evidence (Nygren & Guath, 2021, p. 6).

Critical Thinking: Considered “the art of analyzing and evaluating thinking with a view to improving it" (Paul & Elder, 2014, p. 2).

Data Mining: Data mining is the process of monitoring large volumes of data by combining tools from statistics and artificial intelligence to recognize useful patterns. Through collecting information about an individual’s activity, disinformation agents have a mechanism by which they can target users on the basis of their posts, likes, and browsing history (Ghosh & Scott, 2018).
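
For instructors who want a concrete illustration of the targeting step, the sketch below is hypothetical (the interest categories, keyword lists, and posts are all invented for illustration): it tallies keyword matches in a user's post history to infer an interest category that a disinformation agent could then target.

```python
from collections import Counter

# Hypothetical interest categories and keyword lists (invented for illustration).
INTEREST_KEYWORDS = {
    "health": {"vaccine", "diet", "wellness"},
    "politics": {"election", "policy", "senator"},
}

def infer_interests(posts):
    """Tally keyword matches per category across a user's post history."""
    scores = Counter()
    for post in posts:
        words = set(post.lower().split())
        for category, keywords in INTEREST_KEYWORDS.items():
            scores[category] += len(words & keywords)
    return scores

# Invented example posts: the pattern mined from activity drives the targeting.
posts = ["New diet and wellness tips", "Another wellness post", "The election debate tonight"]
print(infer_interests(posts).most_common(1))  # [('health', 3)]
```

Real systems combine far richer signals (likes, shares, browsing history) with machine-learned models, but the principle of turning behavioural traces into targeting categories is the same.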

Deep Fakes: Deepfake is the term currently used to describe fabricated media produced using artificial intelligence. By synthesizing different elements of existing video or audio files, AI enables relatively easy methods for creating ‘new’ content in which individuals appear to speak words and perform actions that are not based on reality. Although deepfakes are still in their infancy, the term is likely to appear more frequently in discussions of disinformation campaigns as these techniques become more sophisticated (Li et al., 2018).

Disinformation: Disinformation is false information that is deliberately created or disseminated with the express purpose to cause harm. Producers of disinformation typically have political, financial, psychological, or social motivations.

Echo Chamber: A space, whether tangible or online, wherein individuals are primarily exposed to confirming opinions (Flaxman et al., 2016). 

Expert: A person having or displaying special skill or knowledge derived from training or experience (Merriam-Webster, n.d.).

Fabrication: The act of making up or creating something for the purposes of deception (Merriam-Webster, n.d.).

Fact-checking: Fact-checking (in the context of information disorder) is the process of determining the truthfulness and accuracy of official, published information such as politicians’ statements and news items.

False Information: Inaccurate or incorrect information.

Fake Experts: In the FLICC Techniques resource, John Cook (2020) defines this as "presenting an unqualified person or institution as a source of credible information."

Fake Followers: Anonymous or imposter social media accounts created to portray false impressions of popularity about another account. Social media users can pay for fake followers, as well as fake likes, views, and shares, to give the appearance of a larger audience.

Fake News: News that appropriates the look and feel of ‘real’ news and presents itself as ‘real’ news (Tandoc Jr. et al., 2018).

Filter Bubbles: A phenomenon within which “algorithms inadvertently amplify ideological segregation by automatically recommending content an individual is likely to agree with” (Flaxman et al., 2016, p. 299).
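
Because this definition is mechanism-focused, a small classroom illustration may help. The sketch below is hypothetical (the article names, leaning scores, and distance-based ranking rule are invented for illustration, not drawn from any real platform): a recommender that ranks content purely by predicted agreement keeps surfacing ideologically nearby items.

```python
# Illustrative filter bubble sketch: articles and the user's leaning are
# modelled as single numbers from -1.0 (one viewpoint) to +1.0 (another).
articles = {
    "Article A": -0.9,
    "Article B": -0.2,
    "Article C": 0.3,
    "Article D": 0.8,
}

def recommend(user_leaning, items, k=2):
    """Rank items by predicted agreement, i.e. closeness to the user's leaning."""
    return sorted(items, key=lambda title: abs(items[title] - user_leaning))[:k]

# A user leaning toward -1.0 is only ever shown ideologically nearby content.
print(recommend(-0.7, articles))  # ['Article A', 'Article B']
```

Because closeness to the user’s current leaning is the only ranking signal, opposing viewpoints are never recommended, which is the narrowing effect the definition describes.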

Framing Effect: The impact that emphasising certain aspects of information over others has on how audiences interpret that information.

Impossible Expectations: A situation wherein actors or individuals are "demanding unrealistic standards of certainty before acting on the science" (Cook, 2020).

Information: The content, or messages, of all meaningful communication (McQuail & Deuze, 2020).

Internalized Systems: The attitudes, beliefs, and values that give rise to our behaviours. These internalized systems affect how people behave in different circumstances and how they act upon receiving information.

Lateral Reading: The act of using additional resources to double-check information; digital resources are particularly important for this (Nygren & Guath, 2021, p. 2).

Logical Fallacy: An error in reasoning; the conclusion of an argument "doesn't logically flow from the premises" (Cook, 2020).

Long-term Memory: A vast store of knowledge and record of prior events stored for substantial periods of time (Cowan, 2008).

Malinformation: Malinformation is genuine information that is shared to cause harm. This includes private or revealing information that is spread to damage a person or their reputation.

Manufactured Amplification: Manufactured amplification occurs when the reach or spread of information is boosted through artificial means. This includes human and automated manipulation of search engine results and trending lists, and the promotion of certain links or hashtags on social media (Wardle & Derakhshan, 2017).

Misinformation: Misinformation is information that is false, but not intended to cause harm. For example, individuals who don’t know a piece of information is false may spread it on social media in an attempt to be helpful.

News: Encompasses “the main form in which current information about public events is carried by media of all kinds” (McQuail & Deuze, 2020), whether online, on television, or in print; An accurate account of real events supposedly based on the truth (Tandoc Jr. et al., 2018).

Parody: Similar to satire, but differs in its use of "non-factual information to inject humor" (Tandoc Jr. et al., 2018, p. 142).

Peer-review: The process of scholars critically appraising each other’s work to ensure a high level of scholarship and credibility in an academic journal, and to improve both the quality and readability of a given article (University of Toronto, n.d.).

Primary Source: Original documents created by witnesses or observers who experienced the events or conditions being documented. Examples include newspaper articles, government documents, photographs, archives, maps, etc. (University of Calgary Libraries and Cultural Resources, 2018).

Propaganda: Propaganda is true or false information spread to persuade an audience, but often has a political connotation and is often connected to information produced by governments. It is worth noting that the lines between advertising, publicity, and propaganda are often unclear (Jack, 2017).

Pseudoscience: A system or set of theories, assumptions, and methods that is incorrectly regarded as scientific (Merriam-Webster, n.d.).

Receiver: The actor, individual, or group that is consuming, engaging with, or receiving a piece of information or media that they did not directly create. The number of receivers can range from one to many. 

Satire: In a fake news context this typically refers "to mock news programs, which typically use humor or exaggeration to present audiences with news updates" (Tandoc Jr. et al., 2018, p. 141).

Scholarly Source: A scholarly source is written by academics or other experts and contributes to knowledge in a particular field by sharing various new research findings, theories, analysis, insights, news, or summaries of other current knowledge. These sources can be either primary or secondary research (University of Toronto Libraries, n.d.).

Secondary Source: A document, such as a book or a journal article, that comments on or interprets primary sources and uses them to support its arguments (University of Calgary Libraries and Cultural Resources, 2018).

Selective Memory: Remembering or recalling only specific events, moments, or memories of the past while omitting others (Abel & Bäuml, 2015). This ‘selective’ remembering is often associated with omitting threatening memories or information (Saunders, 2013).

Sender: The creator of a piece of information, media, or related phenomenon who communicates it to others. Can be an actor, individual, or group.

Short-term Memory: Information held and processed over a short period; it is retained for a shorter duration than information stored in long-term memory (Camina & Güell, 2017).

Troll Farm: An organized and institutionalized group of people who deliberately manipulate and fabricate facts within the online environment. Troll farms are most often created to influence political decisions.

Trolling: Trolling is the act of deliberately posting offensive or inflammatory content to an online community with the intent of provoking readers or disrupting conversation. Today, the term “troll” is most often used to refer to any person harassing or insulting others online. 

Truth Decay: Truth Decay is the diminishing role of facts and analysis in both the political and public arenas. Truth Decay is characterized by four trends (Kavanagh & Rich, 2018): 

  1. Increasing disagreement about facts

  2. A blurring of the line between opinion and fact

  3. The increasing relative volume and resulting influence of opinion over fact

  4. Declining trust in formerly respected sources of facts

Values: The relative worth, utility, or importance ascribed to a feeling, emotion, or object. Values can be held individually or shared amongst a group (Merriam-Webster, n.d.).

Verification: Verification is the process of determining the authenticity of information.

Viral Disinformation: Disinformation that is circulated widely and rapidly, typically in an online setting via social networks (Baptista & Gradim, 2020).

Worldviews: A comprehensive conception or apprehension of the world, generally from a particular standpoint (Merriam-Webster, n.d.).

References

Abel, M., & Bäuml, K. H. T. (2015). Selective memory retrieval in social groups: When silence is golden and when it is not. Cognition, 140, 40-48. 

Agarwal, N. K., & Alsaeedi, F. (2021). Creation, dissemination, and mitigation: Toward a disinformation behaviour framework and model. Aslib Journal of Information Management, 73(5), 639-658. 

Beene, S., & Greer, K. (2021). A call to action for librarians: Countering conspiracy theories in the age of QAnon. The Journal of Academic Librarianship, 47(1), 1-42. 

Brotherton, R. (2013). Towards a definition of “conspiracy theory”. The British Psychological Society’s Quarterly Magazine Special Issue: The Psychology of Conspiracy Theories, 88, 56.

Brotherton, R., & Eser, S. (2015). Bored to fears: Boredom proneness, paranoia, and conspiracy theories. Personality and Individual Differences, 80, 1-5.

Flaxman, S., Goel, S., & Rao, J. M. (2016). Filter Bubbles, Echo Chambers, and Online News Consumption. Public Opinion Quarterly, 80(1), 298-320. 

Ghosh, D., & Scott, B. (2018, January). Digital Deceit: The Technologies Behind Precision Propaganda on the Internet. New America. https://www.newamerica.org/pit/policy-papers/digitaldeceit/

Howard, P. N., & Kollanyi, B. (2016). Bots, #StrongerIn, and #Brexit: Computational Propaganda during the UK-EU Referendum. COMPROP Research. http://comprop.oii.ox.ac.uk/wp-

Jack, C. (2017). Lexicon of Lies. Data & Society. https://datasociety.net/pubs/oh/DataAndSociety_LexiconofLies.pdf

Kavanagh, J., & Rich, M. D. (2018). Truth decay: An initial exploration of the diminishing role of facts and analysis in American public life. RAND Corporation. https://ucalgary.primo.exlibrisgroup.com/permalink/01UCALG_INST/46l39d/alma99102828899990433

Li, Y., Chang, M. C., & Lyu, S. (2018, June 11). In Ictu Oculi: Exposing AI Generated Fake Face Videos by Detecting Eye Blinking. https://doi.org/10.48550/arxiv.1806.02877

McNutt, J., & Boland, K. (2007). Astroturf, Technology and the Future of Community Mobilization: Implications for Nonprofit Theory. Journal of Sociology and Social Welfare, 34(3), 165-178. 

McQuail, D., & Deuze, M. (2020). McQuail’s Media and Mass Communication Theory (7th ed.). Sage Publications.

Merriam-Webster. (n.d.). Authority. In Merriam-Webster.com Dictionary. Retrieved January 16, 2023, from https://www.merriam-webster.com/dictionary/authority

Merriam-Webster. (n.d.). Behaviour. In Merriam-Webster.com Dictionary. Retrieved January 16, 2023, from https://www.merriam-webster.com/dictionary/behaviours

Merriam-Webster. (n.d.). Beliefs. In Merriam-Webster.com Dictionary. Retrieved January 16, 2023, from https://www.merriam-webster.com/dictionary/beliefs. 

Merriam-Webster. (n.d.). Context. In Merriam-Webster.com Dictionary. Retrieved January 16, 2023, from https://www.merriam-webster.com/dictionary/context.  

Merriam-Webster. (n.d.). Expert. In Merriam-Webster.com Dictionary. Retrieved January 16, 2023, from https://www.merriam-webster.com/dictionary/expert.  

Merriam-Webster. (n.d.). Fabrication. In Merriam-Webster.com Dictionary. Retrieved December 4, 2023, from https://www.merriam-webster.com/dictionary/fabricate

Merriam-Webster. (n.d.). Pseudoscience. In Merriam-Webster.com Dictionary. Retrieved June 18, 2023, from https://www.merriam-webster.com/dictionary/pseudoscience.  

Merriam-Webster. (n.d.). Values. In Merriam-Webster.com Dictionary. Retrieved January 16, 2023, from https://www.merriam-webster.com/dictionary/values.  

Merriam-Webster. (n.d.). Worldview. In Merriam-Webster.com Dictionary. Retrieved January 16, 2023, from https://www.merriam-webster.com/dictionary/worldview.  

Nickerson, R. S. (1998). Confirmation Bias: A Ubiquitous Phenomenon in Many Guises. Review of General Psychology, 2(2), 175-220. 

Nygren, T., & Guath, M. (2021). Students Evaluating and Corroborating Digital News. Scandinavian Journal of Educational Research, 1-17. 

Oxford English Dictionary (OED). (n.d.). Actor. In OED: Oxford English Dictionary. Retrieved January 1, 2023, from https://www.oed.com.  

Paul, R., & Elder, L. (2014). Critical Thinking: Concepts and Tools. Foundation for Critical Thinking. Retrieved from https://www.ubiquityuniversity.org/wp-content/uploads/2018/02/Concepts-_-Critical-Thinking-HandbookTools-Ubiquity-University.pdf.  

Reed, K., Hiles, S. S., & Tipton, P. (2019). Sense and Nonsense: Teaching Journalism and Science Students to Be Advocates for Science and Information Literacy. Journalism and Mass Communication Educator, 74(2), 212-226. 

Saunders, J. (2013). Selective memory bias for self-threatening memories in trait anxiety. Cognition and Emotion, 27(1), 21-36. 

Tandoc Jr., E. C., Lim, Z. W., & Ling, R. (2018). Defining “Fake News”: A typology of scholarly definitions. Digital Journalism, 6(2), 137-153. 

Wardle, C., & Derakhshan, H. (2017, September 27). Information Disorder: Toward an interdisciplinary framework for research and policy making. Council of Europe. https://rm.coe.int/information-disorder-toward-an-interdisciplinary-framework-for-researc/168076277c

Welsh, M., & Begg, S. (2016). What have we learned? Insights from a decade of bias research. The APPEA Journal, 56(1), 435. https://doi.org/10.1071/aj15032

Zerback, T., Töpfl, F., & Knöpfle, M. (2021). The disconcerting potential of online disinformation: Persuasive effects of astroturfing comments and three strategies for inoculation against them. New Media & Society, 23(5), 1080-1098.