Disinformation in the media is a growing concern for the stability of nations around the globe. False information intentionally reported by news outlets, or the spread of otherwise deceptive content, can manipulate public opinion and destabilize societies. In recent years, organized propagandists and hostile foreign actors have released “imposter content” that has influenced elections, shifted public opinion, and incited riots. While international concern grows, coordinated action by the global community has not kept pace with the evolving tactics of organized disinformation networks. However, a thorough understanding of key disinformation techniques can empower individuals to be internet-smart users and stop the damage before it is done.
Disinformation vs. Misinformation
Disinformation and misinformation are two distinct forces working within information spaces. Misinformation is misleading or inaccurate information that people share because they believe it to be true, whereas disinformation is intentionally deceptive; in other words, it is wrong on purpose.
Disinformation Tactic #1: Anti-communications Strategy
Harmful actors use anti-communications strategies to propel disinformation across social media, newspapers, radio, and messaging apps.
- Anti-communications is a messaging strategy that does not aim to convey truthful, accurate information. Instead, it seeks to sow chaos in information spaces, confusing users and encouraging them to disengage from discussing or acting on certain issues. This tactic undermines open dialogue and free speech on online platforms and in deliberative spaces.
Disinformation Tactic #2: Astroturfing
Astroturfing is one of the most prevalent disinformation tactics in social media and online spaces. It is intended to sway public opinion by creating a false impression of user engagement or belief. Organized actors sometimes employ bots or exploit algorithms to give extremist views more traction, artificially inflating the likes, interactions, or visibility of divisive, fringe opinions so that they attract greater traffic on social media sites.

- A Facebook post promoting an extreme belief that few people actually hold may show 10,000 likes and hundreds of comments. In such cases, much of that engagement may come not from human users but from bots or organized propagandists incentivized to push divisive beliefs to the forefront of information spaces.
- The deeper consequence of this bot-manufactured consensus is that it can alienate real human users who disagree with the post, persuading them that they hold a minority or losing position on a particular issue. Astroturfing may also flood a platform with a high volume of divisive or incendiary posts intended to drown out honest debate among actual human users. The toy calculation after this list illustrates how dramatically a handful of bots can distort apparent support.
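
To make the distortion concrete, here is a minimal sketch in Python using purely hypothetical numbers (the like counts are invented for illustration, not drawn from any real incident). It shows how bot engagement can make a fringe view with 200 genuine supporters appear to have 10,000.

```python
# Illustrative sketch of astroturfed engagement (all numbers hypothetical).

GENUINE_LIKES = 200    # real human users who endorse the post
BOT_LIKES = 9_800      # automated accounts inflating the count

displayed_likes = GENUINE_LIKES + BOT_LIKES

print(f"Displayed likes: {displayed_likes:,}")                     # 10,000
print(f"Genuine share:   {GENUINE_LIKES / displayed_likes:.0%}")   # 2%
print(f"Inflation:       {displayed_likes / GENUINE_LIKES:.0f}x")  # 50x
```

A casual scroller sees only the 10,000, not the 2% that is genuine, and that gap is exactly the misperception astroturfing relies on.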
Misinformation Tactic #1: Content Farming
Similar to astroturfing, content farming exploits the algorithms of social media platforms or search engines to push certain content to the forefront of online information spaces. These algorithms prioritize particular keywords, phrases, or statements, surfacing the articles, websites, and posts that satisfy those criteria and driving more traffic to them. Organized misinformation networks may therefore create posts, articles, or blogs that contain little reputable content, intended only to satisfy the algorithmic requirements for gaining internet traffic, as the sketch below shows.
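
The sketch below assumes a deliberately naive keyword-matching ranker (real ranking systems are far more sophisticated) to show why stuffing a page with trending terms can outrank substantive writing. The keyword list and sample texts are hypothetical.

```python
import re
from collections import Counter

# Hypothetical trending terms a naive ranker might reward.
TRENDING_KEYWORDS = {"election", "vaccine", "scandal"}

def keyword_score(text: str) -> int:
    """Naive relevance score: total occurrences of trending keywords.

    A ranker this simple rewards keyword stuffing over substance,
    which is exactly the weakness content farms exploit.
    """
    words = Counter(re.findall(r"[a-z']+", text.lower()))
    return sum(words[k] for k in TRENDING_KEYWORDS)

substantive = "A detailed analysis of one election audit and its findings."
farmed = ("election vaccine scandal election scandal vaccine "
          "election scandal vaccine election")

print(keyword_score(substantive))  # 1  -- real reporting, few keywords
print(keyword_score(farmed))       # 10 -- no substance, top score
```

Modern search engines penalize such crude keyword stuffing, but the underlying incentive, satisfying the metric rather than the reader, persists in subtler forms.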
Misinformation Tactic #2: Sleeper Effect
Media and stories are all around us. We are constantly, often unconsciously, consuming and processing narratives that we hear in passing or see in an article we only briefly glance at. The sleeper effect is a psychological phenomenon that helps propel misinformation through internet spaces: we remember a narrative or story but not where we encountered it, and so we are unable to assess the reliability of the source.
This contributes to misinformation in internet spaces when we treat flashy stories stuck in our memory as facts and promote them online. Unsubstantiated stories are often more appealing than those grounded in fact; however, we must be vigilant in ensuring that the assumptions and knowledge we promote on the internet can be traced back to a reputable source.
Misinformation Tactic #3: Clickbait
Clickbait is one of the best-known misinformation tools, yet many users are not fully aware of the serious harm it breeds within information spaces. Clickbait is the deceptive labeling of a title, text, or hyperlink to drive increased traffic to a site or platform. A misleading or exaggerated headline may entice a user to read an article from an unreliable source, even though nothing in the article justifies the title. The consequence is that search engine algorithms prioritize the websites, articles, and platforms with the most clicks and engagement, even when those sources peddle disreputable news. Users who fall for clickbait schemes may inadvertently push fake news to the forefront of information spaces. The sketch below shows one simple way to flag common clickbait phrasing.
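As a rough illustration, here is a minimal Python sketch that flags headlines matching a few surface cues commonly associated with clickbait. The cue list is a hypothetical sample, not a vetted dataset, and a real classifier would need far richer features than pattern matching.

```python
import re

# Hypothetical surface cues often associated with clickbait headlines.
CLICKBAIT_CUES = [
    r"\byou won'?t believe\b",
    r"\bwhat happen(s|ed) next\b",
    r"\bdoctors hate\b",
    r"\bthis one (weird )?trick\b",
    r"\bshocking\b",
]

def looks_like_clickbait(headline: str) -> bool:
    """Return True if the headline matches any known clickbait cue.

    This is a sketch: pattern matching catches only the crudest cases
    and says nothing about whether the linked article is accurate.
    """
    h = headline.lower()
    return any(re.search(pattern, h) for pattern in CLICKBAIT_CUES)

print(looks_like_clickbait("You Won't Believe What Happened Next"))    # True
print(looks_like_clickbait("City council approves 2024 road budget"))  # False
```

Reading past the headline before sharing remains the more reliable defense; no automated filter substitutes for checking the source.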

To learn more about disinformation and media literacy, visit the YALIChecks page.