
Misinformation

Misinformation is incorrect or misleading information.[1][2] Misinformation can exist without specific malicious intent; disinformation is distinct in that it is deliberately deceptive and propagated.[3][4][5] Misinformation can include inaccurate, incomplete, misleading, or false information as well as selective or half-truths.[6][7]

Not to be confused with Disinformation or Misinformation effect.

Much research on how to correct misinformation has focused on fact-checking.[8] However, this can be challenging because the information deficit model does not necessarily apply well to beliefs in misinformation.[9][10] Various researchers have also investigated what makes people susceptible to misinformation.[10] People may be more prone to believe misinformation because they are emotionally connected to what they are listening to or reading. Social media has made information readily available at any time, and it connects vast groups of people along with their information at once.[11] Advances in technology have changed the way people communicate information and the way misinformation spreads.[8] Misinformation can influence people's beliefs about communities, politics, medicine, and more.[11][12] The term also has the potential to be used to obfuscate legitimate speech and warp political discourse.


The term gained wider recognition from the mid-1990s through the early 2020s, when its effects on public ideological influence began to be investigated. However, misinformation campaigns have existed for hundreds of years.[13][14]

Causes[edit]

Factors that contribute to beliefs in misinformation are an ongoing subject of study.[35] According to Scheufele and Krause, misinformation belief has roots at the individual, group and societal levels.[36] At the individual level, individuals have varying levels of skill in recognizing mis- or dis-information and may be predisposed to certain misinformation beliefs due to other personal beliefs, motivations, or emotions.[36] At the group level, in-group bias and a tendency to associate with like-minded or similar people can produce echo chambers and information silos that can create and reinforce misinformation beliefs.[36][37] At the societal level, public figures like politicians and celebrities can disproportionately influence public opinions, as can mass media outlets.[38] In addition, societal trends like political polarization, economic inequalities, declining trust in science, and changing perceptions of authority contribute to the impact of misinformation.[36]


Historically, people have relied on journalists and other information professionals to relay facts.[39] As the number and variety of information sources has increased, it has become more challenging for the general public to assess their credibility.[40] Sources of misinformation can appear highly convincing and similar to trusted legitimate sources.[41] For example, misinformation cited with hyperlinks has been found to increase readers' trust. Trust is even higher when these hyperlinks are to scientific journals, and higher still when readers do not click on the sources to investigate for themselves.[42][43] Research has also shown that the presence of relevant images alongside incorrect statements increases both their believability and shareability, even if the images do not actually provide evidence for the statements.[44][45] For example, a false statement about macadamia nuts accompanied by an image of a bowl of macadamia nuts tends to be rated as more believable than the same statement without an image.[44]


The translation of scientific research into popular reporting can also lead to confusion if it flattens nuance, sensationalizes the findings, or places too much emphasis on weaker levels of evidence. For instance, researchers have found that newspapers are more likely than scientific journals to cover observational studies and studies with weaker methodologies.[46] Dramatic headlines may gain readers' attention, but they do not always accurately reflect scientific findings.[47]

Identification[edit]

Research has yielded a number of strategies that can be employed to identify misinformation, many of which share common features. According to Anne Mintz, editor of Web of Deception: Misinformation on the Internet, one of the simplest ways to determine whether information is factual is to use common sense.[48] Mintz advises that the reader check whether the information makes sense and whether the source or sharers of the information might be biased or have an agenda. However, because emotions and preconceptions heavily impact belief, this is not always a reliable strategy.[10] It can be difficult to undo the effects of misinformation once individuals believe it to be true.[49] Individuals may desire to reach a certain conclusion, causing them to accept information that supports that conclusion, and are more likely to retain and share information if it emotionally resonates with them.[50]


The SIFT Method, also called the Four Moves, is one commonly taught method of distinguishing between reliable and unreliable information.[51] This method instructs readers first to Stop and ask themselves what they are reading or viewing: do they know the source, and is it reliable? Second, readers should Investigate the source: what is the source's relevant expertise, and does it have an agenda? Third, readers should Find better coverage, looking for reliable reporting on the claim at hand to understand whether there is a consensus around the issue. Finally, readers should Trace claims, quotes, or media to their original context: has important information been omitted, or is the original source questionable?


Visual misinformation presents particular challenges, but there are some effective strategies for identification.[52] Misleading graphs and charts can be identified through careful examination of the data presentation; for example, truncated axes or poor color choices can cause confusion.[53] Reverse image searching can reveal whether images have been taken out of their original context.[54] There are currently some somewhat reliable ways to identify AI-generated imagery,[55][56] but identification is likely to become more difficult as the technology advances.[57][58]
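The distortion introduced by a truncated axis can be illustrated with a small sketch (not drawn from the cited sources; the numbers are hypothetical). When a bar chart's y-axis starts at a nonzero baseline, the drawn heights of the bars no longer reflect the true ratio of the values:

```python
def apparent_ratio(a, b, baseline=0.0):
    """Ratio of two bar heights as drawn when the y-axis starts at
    `baseline` instead of zero. With baseline=0 this is the honest
    ratio a/b; a raised baseline exaggerates the difference."""
    if a <= baseline or b <= baseline:
        raise ValueError("baseline must sit below both values")
    return (a - baseline) / (b - baseline)

# Honest chart: values 52 and 50 differ by only 4%.
print(round(apparent_ratio(52, 50), 2))               # 1.04
# Truncated axis starting at 49: the same bars look 3x different.
print(round(apparent_ratio(52, 50, baseline=49), 2))  # 3.0
```

This is why checking where the axis begins is one of the quickest ways to spot a misleading chart: the underlying data need not be false for the visual impression to be.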


A person's formal education level and media literacy do correlate with their ability to recognize misinformation.[59][60] People who are familiar with a topic, with the processes of researching and presenting information, or who have critical evaluation skills are more likely to correctly identify misinformation. However, these are not always direct relationships. Higher overall literacy does not always lead to improved ability to detect misinformation.[61] Context clues can also significantly impact people's ability to detect misinformation.[62]


Martin Libicki, author of Conquest In Cyberspace: National Security and Information Warfare,[63] notes that readers should aim to be skeptical but not cynical. Readers should not be gullible, believing everything they read without question, but also should not be paranoid that everything they see or read is false. The Liar's Dividend describes a situation in which individuals are so concerned about realistic misinformation (in particular, deepfakes) that they begin to mistrust real content, particularly if someone claims that it is false.[64] For instance, a politician could benefit from claiming that a real video of them doing something embarrassing was actually AI-generated or altered, leading followers to mistrust something that was actually real. On a larger scale this problem can lead to erosion in the public's trust of generally reliable information sources.[64]

Countermeasures[edit]

Proposed countermeasures include:

- Automated detection systems (e.g. to flag or add context and resources to content)
- An emerging anti-misinformation sector (e.g. organizations combating scientific misinformation)
- Provenance-enhancing technology (i.e. better enabling people to determine the veracity of a claim, image, or video)
- APIs for research (i.e. for use in detecting, understanding, and countering misinformation)
- Active bystanders (e.g. corrective commenting)
- Community moderation (usually by unpaid, untrained, and often independent volunteers)
- Anti-virals (e.g. limiting the number of times a message can be forwarded in privacy-respecting encrypted chats)
- Collective intelligence (examples being Wikipedia, where multiple editors refine encyclopedic articles, and question-and-answer sites, where outputs are also evaluated by others in a process similar to peer review)
- Trustworthy institutions and data
- Media literacy
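A minimal sketch of how an "anti-viral" forwarding cap might work (a hypothetical illustration, not the implementation of any specific messenger): each message carries a hop counter that the client increments on every forward and refuses to exceed, so no central server needs to read message contents.

```python
FORWARD_LIMIT = 5  # hypothetical cap, similar in spirit to limits some messengers use

def forward(message):
    """Return a forwarded copy of `message`, or None if it has hit the cap.
    The counter travels with the message itself, so the check runs on the
    client and end-to-end encryption is preserved."""
    hops = message.get("forwards", 0)
    if hops >= FORWARD_LIMIT:
        return None  # client refuses to forward further
    return {**message, "forwards": hops + 1}

msg = {"text": "breaking news!", "forwards": 0}
for _ in range(FORWARD_LIMIT):
    msg = forward(msg)      # five forwards succeed
assert forward(msg) is None  # the sixth is blocked
```

The design choice here is that virality is slowed rather than content judged: the cap applies to every message equally, which is what makes it compatible with encrypted chats.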


Challenges to countering misinformation include:

- the profusion of misinformation sources, which makes the reader's task of weighing the reliability of information more challenging;[127]
- social media's propensity for culture wars, which embeds misinformation in identity-based conflict;[9]
- the proliferation of echo chambers, which form an epistemic environment in which participants encounter beliefs and opinions that coincide with their own,[128] moving the entire group toward more extreme positions.[128][9]

Mass media, trust, and transparency[edit]

Competition in news and media[edit]

Because news organizations and websites compete for viewers, there is a need for efficiency in releasing stories to the public. The news media landscape in the 1970s offered American consumers access to a limited but often consistent selection of news offerings, whereas today consumers are confronted with an abundance of voices online. This growth of consumer choice in news media allows consumers to choose a news source that may align with their biases, which consequently increases the likelihood that they are misinformed.[69] In 2017, 47% of Americans reported social media as their main news source, as opposed to traditional news sources.[159] News media companies often broadcast stories 24 hours a day and break the latest news in hopes of taking audience share from their competitors. News can also be produced at a pace that does not always allow for fact-checking, or for all of the facts to be collected or released to the media at one time, letting readers or viewers insert their own opinions and possibly leading to the spread of misinformation.[160]

Inaccurate information from media sources[edit]

A Gallup poll made public in 2016 found that only 32% of Americans trust the mass media "to report the news fully, accurately and fairly", the lowest number in the history of that poll.[161] An example of bad information from media sources that led to the spread of misinformation occurred in November 2005, when Chris Hansen on Dateline NBC claimed that law enforcement officials estimate 50,000 predators are online at any moment. Afterward, the U.S. attorney general at the time, Alberto Gonzales, repeated the claim. However, the number that Hansen used in his reporting had no backing. Hansen said he received the information from Dateline expert Ken Lanning, but Lanning admitted that he made up the number 50,000 because there was no solid data on the number. According to Lanning, he used 50,000 because it sounds like a real number, not too big and not too small, and referred to it as a "Goldilocks number". Reporter Carl Bialik says that the number 50,000 is used often in the media to estimate numbers when reporters are unsure of the exact data.[162]


The Novelty Hypothesis was developed by Soroush Vosoughi, Deb Roy, and Sinan Aral to learn more about what attracts people to false news. In their study, they compared false and true rumors shared on Twitter, examining both the users and the information they spread. They found that people are connected through emotion: false rumors evoked more surprise and disgust, which drew people in, while true rumors evoked more sadness, joy, and trust. The study showed which emotions are more likely to drive the spread of false news.[11]

Machado, Caio; Kira, Beatriz; Narayanan, Vidya; Kollanyi, Bence; Howard, Philip (2019). "A Study of Misinformation in WhatsApp groups with a focus on the Brazilian Presidential Elections". Companion Proceedings of the 2019 World Wide Web Conference. pp. 1013–1019. doi:10.1145/3308560.3316738. ISBN 978-1-4503-6675-5. S2CID 153314118.

Allcott, H.; Gentzkow, M. (2017). "Social Media and Fake News in the 2016 Election". Journal of Economic Perspectives. 31 (2): 211–236. doi:10.1257/jep.31.2.211. S2CID 32730475.

Baillargeon, Normand (4 January 2008). A Short Course in Intellectual Self-Defense. Seven Stories Press. ISBN 978-1-58322-765-7. Retrieved 22 June 2011.

Bakir, Vian; McStay, Andrew (7 February 2018). "Fake News and The Economy of Emotions: Problems, causes, solutions". Digital Journalism. 6 (2): 154–175. doi:10.1080/21670811.2017.1345645. S2CID 157153522.

Christopher Cerf and Victor Navasky, The Experts Speak: The Definitive Compendium of Authoritative Misinformation. Pantheon Books, 1984.

Cook, John; Stephan Lewandowsky; Ullrich K. H. Ecker (2017-05-05). "Neutralizing misinformation through inoculation: Exposing misleading argumentation techniques reduces their influence". PLOS One. 12 (5): e0175799. Bibcode:2017PLoSO..1275799C. doi:10.1371/journal.pone.0175799. PMC 5419564. PMID 28475576.

Helfand, David J., A Survival Guide to the Misinformation Age: Scientific Habits of Mind. Columbia University Press, 2016. ISBN 978-0231541022.

Christopher Murphy (2005). Competitive Intelligence: Gathering, Analysing And Putting It to Work. Gower Publishing, Ltd. pp. 186–189. ISBN 0-566-08537-2. A case study of misinformation arising from simple error.

O'Connor, Cailin; Weatherall, James Owen (1 September 2019). "How Misinformation Spreads—and Why We Trust It". Scientific American.

O'Connor, Cailin, and James Owen Weatherall, The Misinformation Age: How False Beliefs Spread. Yale University Press, 2019. ISBN 978-0300241006.

Offit, Paul (2019). Bad Advice: Or Why Celebrities, Politicians, and Activists Aren't Your Best Source of Health Information. Columbia University Press. ISBN 978-0-231-18699-5.

Persily, Nathaniel, and Joshua A. Tucker, eds. Social Media and Democracy: The State of the Field and Prospects for Reform. Cambridge University Press, 2020. ISBN 978-1108858779.

Jürg Strässler (1982). Idioms in English: A Pragmatic Analysis. Gunter Narr Verlag. pp. 43–44. ISBN 3-87808-971-6.

West, Jevin D.; Bergstrom, Carl T. (2021). "Misinformation in and about science". Proceedings of the National Academy of Sciences. 118 (15). Bibcode:2021PNAS..11812444W. doi:10.1073/pnas.1912444117. PMC 8054004. PMID 33837146.

Connie Hanzhang Jin; Miles Parks (April 20, 2020). "Comic: Fake News Can Be Deadly. Here's How To Spot It" (audio tutorial, graphic tutorial). NPR.

"Free Misinformation and Disinformation Training online" (free online class). Management and Strategy Institute. 23 August 2022.