Disseminate – Counterpropaganda Principle #9

Share exposed propaganda with audiences not targeted; they can then recognize the lies and reciprocate.

  • Effective propaganda targets a particular audience – sharing it with another group reveals it as biased and targeted propaganda [1]
  • Revealing propaganda as false reduces its effectiveness across many target audiences [1]
  • Your targeted messages may be revealed to untargeted audiences and used against you.
  • The existence of targeted messages, even when their contents are unknown, can be used as both propaganda and counterpropaganda.

Four Illustrations of Dissemination



Trump and Putin: Conversation or Treason

When information intended for a particular audience is shared with other audiences, the principle of Dissemination is in operation. Whether that information is pure propaganda or back-room dealing, it can be harmful to any audience intentionally excluded. Trump’s private conversations with Putin fall into the category of “back-room dealings” for several reasons:

  • Trump’s long history of business dealings with Putin and other Russians, which has led to accusations that Trump laundered money for the Russians
  • Trump’s frequent “cozying-up” to Putin
  • Trump’s preference for Putin’s evidence-free assertions over facts collected by our U.S. Intelligence agencies
  • The fact of Russian interference in American social media and the 2016 election
  • Trump’s long-held desire to build a “tower” in Moscow
  • Trump’s attacks on the American free press
  • Trump’s invention of a phony “Deep State” which he claims is out to get him
  • Trump’s refusal to share contents of his conversations with Putin with any members of his administration
  • The fact that all previous presidents have kept meticulous notes of meetings with potential and actual enemies and shared them with all appropriate members of their administrations

The Mueller investigation has not yet revealed what evidence it has concerning collusion between the Russians and Trump before the November 2016 election. We do know that the Russians trolled us, created social media bots and misinformation messages, and hacked American email systems. They did the same in Great Britain before the June 2016 Brexit referendum and in France before its May 2017 presidential election.

We know that Trump has met several times with Russian President Vladimir Putin, but we don’t know what they talked about. It could be football, Trump towers in Moscow, election hacking or treason. This may be an extreme example of finding a “message targeted for a particular audience” (Putin) and disseminating it to other audiences (the U.S. Congress and the American public). It could be extremely damaging to Trump if the contents of their talks are revealed. Collusion, and possibly treason, are suspected. The stakes are high.

On Jan. 15, 2019, Peter Baker of the New York Times summarized their meetings: [2]

The first time [July 2017] they met was in Germany. President Trump took his interpreter’s notes afterward and ordered him not to disclose what he heard to anyone. Later that night, at a dinner, Mr. Trump pulled up a seat next to President Vladimir V. Putin to talk without any American witnesses at all. Their third encounter [November 2017] was in Vietnam when Mr. Trump seemed to take Mr. Putin’s word that he had not interfered in American elections. A formal summit meeting followed in Helsinki, Finland [July 2018], where the two leaders kicked out everyone but the interpreters. Most recently, they chatted in Buenos Aires [December 2018] after Mr. Trump said they would not meet because of Russian aggression. [2]

For a man whose claims of “No Collusion” with the Russians could number into the thousands – the one-time record may be the 23 repetitions on 12/28/17 to New York Times reporter Michael S. Schmidt [3] – Trump goes to great lengths to keep his conversations with Putin private. This looks like collusion then and collusion still. Congress, our intelligence services, members of the administration and most Americans want to know what was said. According to an ABC News/Washington Post poll released in January 2019, six in 10 Americans said they backed the Democratic inquiries into the content of these conversations. [4]

On the July 2017 meeting in Hamburg, Germany, Allegra Kirkland of TalkingPointsMemo.com writes:

According to reports, the interpreter told other administration officials that Putin denied Russia’s interference in the U.S. election and Trump replied, “I believe you.” U.S. officials received no formal readout. The [Washington] Post reported that Trump actually confiscated the notes of his own interpreter, and instructed the linguist present not to discuss the content of the conversation with other administration officials. [5]

About their brief informal November 2017 meeting in Vietnam, Kirkland comments:

On Air Force One en route to Asia, Trump told reporters that he “expected” to meet with Putin since he wanted the Russian president’s “help on North Korea.” [5]

The July 2018 meeting in Helsinki, Finland, included only Putin, his unidentified translator, Trump and his translator, Marina Gross. The Washington Post reported that Gross emerged with “pages of notes.” At the press conference after the meeting, Trump stated:

“My people came to me, Dan Coats came to me and some others and said they think it’s Russia. I have President Putin. He just said it’s not Russia. I will say this. I don’t see any reason why it would be, but I really do want to see the server.” [6]

That’s a garbled statement to parse, but it seemed to say that Trump believed Putin over his own intelligence services, which alarmed people throughout Washington and across America. House Democrats demanded translator Gross’s notes, but were blocked by then-ruling Republicans on the House intelligence committee. The Democrats are now in control, and calls for the notes are resuming. [7]

Trump reversed his statement of belief the following day.

The president, making what he described as clarifying comments in a meeting with members of Congress at the White House Tuesday, said he meant to say that he had no reason to think Russia “wouldn’t” have interfered in the 2016 election, instead of what he actually said on Monday, which is that he had no reason to think Russia “would” have interfered….”The sentence should have been, I don’t see any reason why I wouldn’t, or why it wouldn’t be Russia. So just to repeat it, I said the word ‘would’ instead of ‘wouldn’t’ and this sentence should have been, and I thought I would maybe be a little bit unclear on the transcripts or unclear on the actual video, but the sentence should have been, I don’t see any reason why it wouldn’t be Russia. So sort of a double negative.” [8]

Uh huh. Right.

House Foreign Affairs Committee Chairman Eliot L. Engel (D-NY) and Rep. Adam B. Schiff (D-CA), chairman of the House Permanent Select Committee on Intelligence, want to know what was said in these meetings. They are threatening to subpoena Trump’s translator, Marina Gross, along with whatever notes she still has in her possession. [7]

“I would prefer not to do that,” Mr. Engel told CNN last weekend of a subpoena for Ms. Gross. “We have to see what we can find out. We may have no choice.” [7]

Daniel Hoffman, a former CIA Moscow station chief, argues that lawmakers should instead seek testimony on the issue from Director of National Intelligence Daniel Coats, CIA Director Gina Haspel and others before violating a presidential translator’s right to confidentiality in the most sensitive of conversations, calling it a “slippery slope” to a “Pandora’s box.” Testimony from Coats and Haspel could shed light on whether Trump’s advisers have been kept ignorant of the content of his discussions with Putin. [7]

However, it’s quite possible that our intelligence agencies already know what Trump and Putin said in their Helsinki conversations. The room was possibly bugged. [7]

According to Guy Taylor of The Washington Times (1-17-19):

“It’s more than conceivable that Finnish intelligence had the room bugged, and they likely would have shared a transcript of what was said either directly with the CIA or with people accessible to U.S. intelligence officers,” said one of the sources, who spoke only on the condition of anonymity. [7]

Peter Baker of the New York Times reports:

“What’s disconcerting is the desire to hide information from your own team,” said Andrew S. Weiss, who was a Russia adviser to President Bill Clinton. “The fact that Trump didn’t want the State Department or members of the White House team to know what he was talking with Putin about suggests it was not about advancing our country’s national interest but something more problematic.” [2]

We’ll give the last word to Guy Taylor of the Washington Times:

Some say Mr. Trump’s desire to keep the Putin talks private springs from the harsh media criticism he got for the Helsinki press conference. “What he said in private to Mr. Putin likely wasn’t all that different from what he said publicly at the press conference,” said one intelligence source. “At the end of the day, he probably doesn’t want it exposed that he also sounded like an idiot during the closed-door meeting.” [7]



Hillary Clinton’s Wall Street Speeches

This is another example of someone struggling to keep statements aimed at one audience from being disseminated to others. While we don’t yet know what Trump and Putin said to each other, we do know what Hillary Clinton told Wall Street investors. Here we see a case of potential counterpropaganda being turned effectively into powerful propaganda. The fact of Clinton’s attempted secrecy was used against her with repeated questions such as, “What did she say? Why won’t she tell us?” What she did say didn’t really amount to much, as we’ll see below, and a non-defensive openness on her part would have been the better stance.

Hillary Clinton was long lambasted by her political opponents, especially Donald Trump and Senator Bernie Sanders, for not releasing the texts of her speeches to Wall Street. She was accused of being too cozy with the big bankers and financiers who were widely and angrily viewed as responsible for the 2007-08 crash. After leaving the White House in 2001, the Clintons made more than $120 million from speeches to Wall Street and other special interests. Mrs. Clinton sometimes donated her usual $225,000 fee to their family foundation. [9]

Bernie Sanders: “I kind of think if you’re going to be paid $225,000 for a speech, it must be a fantastic speech, a brilliant speech which you would want to share with the American people.” [9]

“What did she tell them that she doesn’t want the rest of us to know?” people wondered.

  • She dreamed of “open trade and open borders” throughout the Western Hemisphere. [9]
  • Citing Abraham Lincoln’s back-room deal-making, she speculated that one must have “both a public and a private position” on divisive issues. [9]
  • Their family’s increasing wealth was making her “kind of far removed” from the struggles of the middle class. (This followed her statement of empathy for the “anxiety and even anger in the country over the feeling that the game is rigged.”) [9]
  • “There is such a bias against people who have led successful and/or complicated lives.” She felt it was “very onerous and unnecessary” that public officials must sell or divest their assets before serving. [9]
  • It was an “oversimplification” to blame the 2008 global financial crisis on the U.S. banking system. [9]
  • The deficit-reduction proposal created by the leaders of President Obama’s fiscal commission, which suggested raising the Social Security retirement age, was “the right framework.” [9]
  • To Goldman Sachs in October 2013: “There’s nothing magic about regulations: too much is bad, too little is bad. How do you get to the golden key?” [10]

In light of events and allegations since then – porn stars, treason, pee pee tapes, election hacking, interference with official investigations, forced separation of families, and many other “highlights” of our Trumpian era – one wonders what was so bad about such comments that Hillary struggled to keep them secret. As William D. Cohan put it in his Vanity Fair article: [10]

“But while it may not have been wise to book a bunch of talks with financial elites in advance of a populist election season—no one has ever accused Clinton of profound political acuity—it seems like her greatest crime, at least according to these documents, is having a fairly nuanced understanding of capital markets. And, perhaps more notably, some disarming honesty.” [10]

For months during mid-2016, Julian Assange – who had long despised Hillary Clinton – had been promising to release documents before the November election that would bring results characterized as career-ending, guaranteed jail time, and so forth for Ms. Clinton. Roger Stone, Trump’s self-styled political “trickster,” seemed to be all over it. Finally, on October 7, 2016, WikiLeaks released thousands of documents stolen from the State Department, the Clinton campaign and John Podesta, Clinton’s campaign chairman. [11]

On March 19, 2016, John Podesta received an email which appeared to be from Google, informing him that his email account had just been hacked, advising him to immediately change his password, and thoughtfully supplying a link to Google to make the change. His chief of staff asked a help desk employee to look at it, and was advised that the email was valid and Podesta should make the changes. Which he did. The email was a “spear-phish,” and hackers now had access to Podesta’s email account. [12]
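How would one catch such a link? Below is a minimal, illustrative sketch of the kind of check a cautious recipient or mail filter could apply – compare the domain a link actually resolves to against a short allowlist of expected domains. The allowlist and the sample look-alike URL are hypothetical stand-ins, not the actual domains used in the attack.

```python
# Minimal sketch of a phishing-link check. The Podesta "spear-phish" looked
# like a Google security alert, but its link led to a non-Google host.
# The allowlist and sample URLs below are hypothetical illustrations.
from urllib.parse import urlparse

EXPECTED_DOMAINS = {"google.com"}

def link_is_suspicious(url: str) -> bool:
    """Return True if the URL's host is neither an expected domain
    nor a legitimate subdomain of one."""
    host = urlparse(url).hostname or ""
    return not any(host == d or host.endswith("." + d) for d in EXPECTED_DOMAINS)

# A look-alike host fails the check; a real Google host passes.
print(link_is_suspicious("http://myaccount.google.com-security.tk/reset"))  # True
print(link_is_suspicious("https://accounts.google.com/signin"))             # False
```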

It was later discovered that the Podesta hack was one of some 3,900 hacks on “targeted individuals in government, the military, people who worked for companies in military and government supply chains, journalists, people who worked for the DNC and members of Hillary Clinton’s campaign organization like Podesta” instigated by Fancy Bear, a Russian hacking group allied to or part of the Russian troll farm in St. Petersburg. Assange claimed that the Russians were not the source of the hacked documents, and that identifying whoever hacked them wasn’t important anyway. He was either lying, obfuscating, or wrong. [13]

Among the Podesta documents was a list of 25 excerpted comments from Clinton’s paid Wall Street speeches which a staffer had flagged as “politically problematic.” [14]

On October 7, 2016, a mere 30 minutes after the Access Hollywood tape was first published, WikiLeaks began publishing thousands of emails from Podesta’s Gmail account. Throughout October, WikiLeaks released installments of these emails on a daily basis. On December 18, 2016, John Podesta stated on Meet the Press that the FBI had contacted him about the leaked emails on October 9, 2016, but had not contacted him since. [15]

The timing of the release was ideal. It managed to immediately smother much of the ill effect that the Access Hollywood tape – the recording of Trump’s famous “pussy-grabbing” claim – would have had on Trump’s campaign and, like a timed-release poison pill, have sufficient time to boil and fester in the public mind throughout the month remaining until the November 8 election. This was a classic “October Surprise” operation.

As cited above, Clinton’s comments in the Wall Street speeches amounted to far more in the minds of Clinton’s opponents, and of those Americans suitably prepared by forward-looking propaganda, than they amounted to in fact. “Nuanced understanding” and “disarming honesty” don’t amount to anything remotely evil, except in the prepared mind of the brainwashed.

Cold War, Hot Propaganda

The end of World War II brought us the Cold War. The concept was that a cold war was the alternative to a hot war. The Cold War was a war primarily of propaganda rather than shooting, at least as far as the Soviet-U.S. rivalry was concerned. [16]

During the Cold War, the Soviets used propaganda both to spread their own ideology and to undermine their enemies. In 1983 Soviet Communist Party Politburo member Konstantin Chernenko told a Central Committee meeting:

Comrades, our entire system of ideological work should operate as a well-arranged orchestra in which every instrument has a distinctive voice and leads its theme, while harmony is achieved by skillful conducting. The main demands on party leadership of ideological work are constantly to check the tone of propaganda against our policy goals and people’s interests, and to ensure that ‘word becomes deed,’ as Lenin put it. Propaganda is called upon to embrace every aspect of social life and every social group and region and to reach every individual. [17]

When able, the United States would share with European nations examples of Soviet disinformation targeting the Third World. The Europeans were then able to identify Soviet propaganda targeting them. [1]

In August 1986, Washington Post reporter John Goshko received a letter which United States Information Agency employee Herbert Romerstein had supposedly written to Senator David Durenberger, instructing the Senator on how the U.S. could utilize the Chernobyl nuclear energy power plant disaster as an effective propaganda campaign. Goshko showed the letter to Romerstein who determined it was a forgery, based on a previous letter Romerstein had written about a different topic. A copy of that earlier letter had been given to a Czech diplomat by Romerstein, specially marked so that if it reappeared later, its origin would be known. This Czech diplomat was a known Russian agent and the letter made its way from him to the KGB “safe house” in Washington, DC, whose personnel then used it to forge the new letter, leaving the letterhead and signature intact and inserting new text. Romerstein writes: “The FBI and other organizations in the Active Measures Working Group [AMWG] used the forgery as an example of KGB methods and we in fact got more mileage out of it than the Soviets ever could have.” [18]

Such Soviet disinformation became an issue between Soviet leader Mikhail Gorbachev and Secretary of State George Shultz. At a Moscow meeting, Gorbachev, waving a copy of an AMWG publication which the working group had distributed throughout the world, complained that the report was against the spirit of glasnost – the Soviet propaganda campaign that heralded Gorbachev’s “openness.” Shultz responded that when the KGB stopped lying about us, we would stop exposing them. Later that year, at the Washington summit, Gorbachev told USIA Director Wick that it was time for “no more lies, no more disinformation.” Yet the Soviet forgeries and disinformation stories continued. [19]

Soviet disinformation declined as the U.S.S.R. approached its final collapse. When the Soviet Union fell and the Cold War ended, the AMWG was disbanded as well. No one considered that other entities would arise to distribute anti-American propaganda that would need to be countered. But former “friends of the Soviet Union” went on to befriend the Islamic extremists who shared their hatred for the United States and the West. Unfortunately for America, much of its counterpropaganda infrastructure had already been dismantled. [20]

Trump’s Tax Returns

Rating of “Broken” from PolitiFact.com [21]
In 1973, President Richard Nixon released his tax returns dating back to 1969, and every president since has followed suit. Until Donald Trump, that is. About this topic, Trump – famous for openly talking about the length of his fingers and other critically important topics – simply will not talk. [22]

White House senior adviser Kellyanne Conway said on Jan. 22, 2017 that Trump wouldn’t do it.

“The White House response is that he’s not going to release his tax returns,” she told ABC’s “This Week.” “We litigated this all through the election. People didn’t care.” [22]

And many thanks to Kellyanne Conway for informing us of what we don’t care about. Most of us did not know that we do not care.

Since then Trump has repeatedly dodged the question, saying he can’t release his tax returns until the audit of his returns is finished. This is nonsense on stilts. Anyone can release any of their own tax returns – audited, undergoing audit, pre-audit – whenever they want. Unaudited returns are subject to later amendment resulting from an audit, but everyone understands this. And it’s always informative to know what someone tried to get away with before an IRS audit caught it and forced a correction. [23]

“Well, I told you, I will release them as soon as the audit. Look, I’ve been under audit almost for 15 years. I know a lot of wealthy people that have never been audited. I said, do you get audited? I get audited almost every year.” – Donald Trump at first presidential debate, Sep. 26, 2016 [24]

George Stephanopoulos on Good Morning America asked Trump why he released his tax returns (several late-’70s returns, released in 1981) when applying for a casino license, but won’t do so when running for president. Trump revealingly answered, “Well, at the time it didn’t make any difference to me. Now it does.” Stephanopoulos pressed Trump for his tax rate, which is not shown on Trump’s financials. Trump replied: “It’s none of your business. You’ll see it when I release, but I fight very hard to pay as little tax as possible.” Trump didn’t want to show how little tax he pays; he may not pay any at all. He complains loudly about other rich people not paying taxes, but likely pays none himself. [23]

Rating of “False” from PolitiFact.com [25]
Why might Trump want to keep his tax returns secret? Several possibilities immediately come to mind:

  • Income far lower than he claims
  • Debts (and resulting interest expense) far higher than he claims
  • Charitable contributions far lower than he claims
  • Hidden income
  • Income resulting from debt forgiven by others
  • Fraudulent deductions
  • Numerous errors and outright frauds caught by the IRS, or perhaps uncaught until now
  • Sources of income he wouldn’t want anyone to know about (kickbacks, gambling, drugs, money laundering, prostitution)

Unfortunately for Trump and his children, the tax returns of his “Charitable Foundation” (I.D. #13-3404773) are public record and anyone can look at them. Many have. The Attorney General of New York, Barbara Underwood, looked, didn’t like what she saw, and sued. Reportedly, the charity is now required to distribute its remaining assets and fold, or has done so already. [25][26]

The gist of the Foundation’s problem is twofold. First, donations to it were solicited and received from other people, in violation of New York State law. Second, the Trumps used some of these funds for personal purposes: campaign donations, personal legal bills, a portrait of Trump, a Tim Tebow football helmet, and what appear to be bribes.

Trump Foundation 2015 Form 990-PF, Statement #5, “Other Assets.”
“Helmet” is the Tim Tebow football helmet for which the foundation paid $12,000 to the Susan G. Komen breast cancer organization. Details on the Schantz fine art are unknown at this time. The Israel fine art is a 6-ft-tall portrait of Donald Trump, painted by Michael Israel, for which the foundation paid $20,000.

One such looks-like-a-bribe-to-us payment was the illegal $25,000 contribution to a political group connected to Florida Attorney General Pam Bondi, who at that time was deciding whether to pursue a fraud investigation into Trump University. For some reason, she then dropped the investigation. Charitable foundations aren’t allowed to donate to political groups, so the contribution was not only illegal, but the foundation’s tax return then claimed it had gone to a similarly-named legitimate nonprofit in Kansas which in fact received no money from the Trump Foundation. [25]

We looked at the Trump Foundation’s 2015 Form 990-PF (Private Foundation) return, received in Ogden, Utah on November 23, 2016. Interesting reading. [If there is an amended, corrected or restated version on-line, we could not find it.] This is about as simple a return as you can find – slightly more complicated than a 1040EZ – yet it contained a glaring “error.” It had only four deductions: Legal Fees $55, Accounting Fees $5,000, New York State Filing Fee $250, and “Other Nondeductible Charitable Contributions” of $41,636. Total deductions for the return: $46,941. [27]

Note: Statement #4 lists the $41,636 of “Other Nondeductible Charitable Contributions” with no further explanation.

You don’t have to be an accountant or a linguist to suspect that “Nondeductible Charitable Contributions” are – ahem – nondeductible. Yet there they are, deducted as expenses, totaling 88.7% of all deductions. No further explanation given.
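The figures are easy to verify. A few lines of Python, using the amounts listed on the return as cited above, confirm both the total and the share:

```python
# Deductions as listed on the Trump Foundation's 2015 Form 990-PF (see above).
deductions = {
    "Legal fees": 55,
    "Accounting fees": 5_000,
    "New York State filing fee": 250,
    "Other nondeductible charitable contributions": 41_636,
}

total = sum(deductions.values())
share = deductions["Other nondeductible charitable contributions"] / total

print(f"Total deductions: ${total:,}")        # $46,941 -- matches the return
print(f"'Nondeductible' share: {share:.1%}")  # 88.7% of all deductions
```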

Our complete report on Trump Foundation 2015 Form 990PF.

Link to Trump Foundation: 2014 Form 990PF
What immediately jumps off the page of this return are the $477,400 contribution from “Richard Ebers Inside Sports and Entertainment Group” and the $50,000 given to “Columbia Grammar and Preparatory School” at 5 West 93rd St. in New York City, listed on pages 15 and 18. Perhaps the school is where one of Trump’s family members attends classes? Is this an “arms-length” transaction? Is it just an attendance fee?

Link to Trump Foundation: 2016 Form 990PF
Two big page-jumpers are the “Reimbursement of Prior Distributions” of $62,184 listed under “Other Income”, and “Donation Processing Fees” of $42,264 on pages 22-23. We hope someone at the IRS or New York State Attorney General’s office has looked closely at these two highly suspect items.

Nonprofit Quarterly published a comment on the Trump Foundation’s 2015 990PF: “Trump Foundation Finally Reveals Self-Dealing on Newly Uploaded 990.” It’s short. Read it.

Apparently Trump ran the foundation like a personal piggy-bank. He put in money, certainly, but so did many others – illegally (Trump likely did not tell them this) – and then Trump spent it however he liked. One might be led to believe that these “charitable” donations from others were hidden payments for services rendered, bribes or kickbacks.

There is absolutely no reason to believe that Trump’s personal and corporate tax returns are any less filled with scams and lies. These returns are often quite complicated, long and detailed, and it is far easier to bury omissions and misstatements among a welter of valid detail, or to commit inadvertent errors. When someone has the gall to make a stick-out-like-a-giant-sore-thumb misstatement on a bare-bones return like the $41,636 cited above, there’s no telling what they’ll get up to when there’s real money and real complexity involved. We wouldn’t be the slightest bit surprised to see the entire family spend jail time over their tax returns, once Congress gets in gear and investigates them.

Trump’s tax returns may not be propaganda in and of themselves, although he has certainly used the Trump Foundation as a propaganda ploy, and his purported wealth, business acumen and brilliant grasp of tax law as propaganda. This will be one situation where Dissemination of the truth to a wider audience is greatly anticipated.

This is the ninth and last installment in our series on counterpropaganda.

Other reports and items of interest:
The Nine Principles of Propaganda begins HERE.
Trump – Our Psychopathic President begins HERE.
For a double-sided PDF copy of the principles of propaganda and counterpropaganda go HERE.
For a double-sided PDF copy of the twelve criteria of psychopathy go HERE.

THE NINE FUNDAMENTAL PRINCIPLES OF COUNTERPROPAGANDA
Propaganda is the backdoor hack into your mind


TRUTH – Honest opposition is practical, moral, and unbiased.
FOCUS – Address only one or at most two points.
CLARITY – Easily understood without further explanation.
RESONATE – Identify audience’s existing sentiments, opinions, and stereotypes that influence their perspectives, beliefs, and actions.
RESPOND – Lies not immediately refuted become the audience’s truth.
INVESTIGATE – Collect and analyze their propaganda to understand their message, target audience & objectives.
SOURCE – Expose covert sources of false propaganda.
REASON – Expose their logical fallacies. Human cognitive biases for rapid thought response make us vulnerable to faulty reasoning.

  •    REASON #8a – Logical Fallacies
  •    REASON #8b – Cognitive Biases
  •    REASON #8c – Continued Influence Effect of Misinformation
  •    REASON #8d – Debiasing Misinformation – Worldview and Backfire

DISSEMINATE – Share exposed propaganda with audiences not targeted; they can then recognize the lies and reciprocate.

Citations
1. Wikipedia – Counterpropaganda. https://en.wikipedia.org/wiki/Counterpropaganda#Dissemination_of_exposed_propaganda

2. Baker, Peter, assisted by Sanger, David E. (2019 Jan 15). “Trump and Putin Have Met Five Times. What Was Said Is a Mystery.” Retrieved 2-25-19 from NYTimes.com: https://www.nytimes.com/2019/01/15/us/politics/trump-putin-meetings.html

3. Schmidt, Michael S. (2017 Dec 28). “Excerpts From Trump’s Interview With The Times.” Retrieved 2-26-19 from NYTimes.com: https://www.nytimes.com/2017/12/28/us/politics/trump-interview-excerpts.html

4. Sakuma, Amanda. (2019 Feb 16). “The House inquiries into the secret Trump-Putin talks are (slowly) taking shape.” Retrieved 2-26-19 from Vox.com: https://www.vox.com/2019/2/16/18227523/house-democrats-investigation-trump-putin

5. Kirkland, Allegra. (2019 Jan 14). “What We Know About The 5 Meetings Between Trump And Putin.” Retrieved 2-26-19 from TalkingPointsMemo.com: https://talkingpointsmemo.com/muckraker/what-we-know-trump-putin-meetings

6. Neufeld, Jennie. (updated 2018 Jul 17). “Read the full transcript of the Helsinki press conference.” Retrieved 2-26-19 from Vox.com: https://www.vox.com/2018/7/16/17576956/transcript-putin-trump-russia-helsinki-press-conference

7. Taylor, Guy, assisted by Meier, Lauren. (2019 Jan 17). “Details of Trump, Putin powwow likely no secret to snooping U.S. intelligence agencies.” Retrieved 2-26-19 from WashingtonTimes.com: https://www.washingtontimes.com/news/2019/jan/17/trump-vladimir-putin-private-talk-likely-known-int/

8. Watson, Kathryn. (Updated 2018 Jul 18). “Trump claims he misspoke about Russian meddling in Putin press conference.” Retrieved 2-26-19 from CBSNews.com: https://www.cbsnews.com/news/trump-meets-with-members-of-congress-today-after-helsinki-putin-meeting-live-updates-2018-07-17/

9. Chozick, Amy; Confessore, Nicholas; and Barbaro, Michael. (2016 Oct 7). “Leaked Speech Excerpts Show a Hillary Clinton at Ease With Wall Street.” Retrieved 2-23-19 from NYTimes.com: https://www.nytimes.com/2016/10/08/us/politics/hillary-clinton-speeches-wikileaks.html

10. Cohan, William D. (2016 Oct 11). “Clinton’s Leaked Wall Street Speeches Reveal, Shockingly, That She Gets Wall Street.” Retrieved 2-23-19 from VanityFair.com: https://www.vanityfair.com/news/2016/10/hillary-clinton-leaked-wall-street-speeches

11. Cooper, Alex. (2016 Oct 7). “Breaking: Julian Assange Finally Releases October Surprise, U.S. Rights Sold on International Market.” Retrieved 2-23-19 from ConservativeDailyPost.com: https://conservativedailypost.com/breaking-julian-assange-finally-releases-october-surprise-u-s-rights-sold-on-international-market/

12. Krawchenko, Katiana. (2016 Oct 28). “The Phishing Email that Hacked the Account of John Podesta.” Retrieved 2-23-19 from CBSNews.com: https://www.cbsnews.com/news/the-phishing-email-that-hacked-the-account-of-john-podesta/

13. Murnane, Kevin. (2016 Oct 21). “How John Podesta’s Emails were Hacked and how to Prevent it from Happening to You.” Retrieved 2-23-19 from Forbes.com: https://www.forbes.com/sites/kevinmurnane/2016/10/21/how-john-podestas-emails-were-hacked-and-how-to-prevent-it-from-happening-to-you/#5647bb252476

14. Wikipedia: Podesta Emails. Retrieved 2-23-19 from: https://en.wikipedia.org/wiki/Podesta_emails#Clinton’s_Wall_Street_speeches

15. Wikipedia: Podesta Emails. Retrieved 2-23-19 from: https://en.wikipedia.org/wiki/Podesta_emails#Publication

16. Romerstein, Herbert (2009). “Counterpropaganda: We Can’t Win Without It,” in Strategic Influence: Public Diplomacy, Counterpropaganda, and Political Warfare (PDF). Washington, DC: Institute of World Politics Press. Pg. 155. Retrieved 2-21-19 from: https://jmw.typepad.com/files/strategicinfluenceclass_copy.pdf

17. Romerstein 2009, page 138

18. Romerstein 2009, pages 168-170

19. Romerstein 2009, page 170

20. Romerstein 2009, page 171

21. Kruzel, John. (2017 May 12). “Trump-O-Meter: Trump ‘might’ release returns ‘when I’m out of office’.” Retrieved 2-26-19 from PolitiFact.com: https://www.politifact.com/truth-o-meter/promises/trumpometer/promise/1421/release-his-tax-returns-after-audit-completed/

22. Disis, Jill. (2017 Jan 26). “Presidential tax returns: It started with Nixon. Will it end with Trump?” Retrieved 2-24-19 from Money.cnn.com: https://money.cnn.com/2017/01/23/news/economy/donald-trump-tax-returns/index.html

23. Easley, Jason. (2016 May 13). “Trump Loses His Temper And Accidentally Reveals Why He Won’t Release His Tax Returns.” Retrieved 2-24-19 from PoliticusUSA.com: https://www.politicususa.com/2016/05/13/trump-loses-temper-accidentially-reveals-release-tax-returns.html

24. Blake, Aaron. (2016 Sep 26). “The first Trump-Clinton presidential debate transcript, annotated.” Retrieved 2-24-19 from WashingtonPost.com:  https://www.washingtonpost.com/news/the-fix/wp/2016/09/26/the-first-trump-clinton-presidential-debate-transcript-annotated/?utm_term=.841a57eff5a2&noredirect=on

25. Carroll, Lauren. (2016 Oct 3). “Factsheet: Donald Trump’s tax returns.” Retrieved 2-24-19 from PolitiFact.com: https://www.politifact.com/truth-o-meter/article/2016/oct/03/donald-trump-tax-returns-factsheet/

26. Waldman, Paul. (2018 Jun 14). “The Trump Foundation was one big scam, according to the New York attorney general. What a shock.” Retrieved 2-25-19 from WashingtonPost.com: https://www.washingtonpost.com/blogs/plum-line/wp/2018/06/14/the-trump-foundation-was-one-big-scam-according-to-the-new-york-attorney-general-what-a-shock/?utm_term=.52cb8f779fb0

27. Cousins, Farron. (2018 Nov 26). The Ring of Fire: “Trump FAILS To Get Charity Fraud Lawsuit Against Him Dismissed.” Retrieved 2-24-19 from TROFire.com: https://trofire.com/2018/11/26/trump-fails-to-get-charity-fraud-lawsuit-against-him-dismissed/

28. Link to our blog on Trump Charitable Foundation

Reason – Counterpropaganda Principle #8d

Expose their logical fallacies. Human cognitive biases for rapid thought response make us vulnerable to faulty reasoning.

  • Propagandists deliberately use errors in arguments to appeal to the emotions of their audience. [1]
  • Propagandists exploit cognitive biases and other elements of decision-making when shaping their messages to influence the target audience. [1]
  • Awareness of logical fallacies in our own reasoning helps people reject misinformation directed at them. [1]
  • Propaganda targets emotional reactions, not cognitive reasoning. [1]
  • Counterpropaganda must target emotions as well as reason. [1]
  • Four cognitive effects interfere with corrections to propaganda and misinformation: [2]
    • Continued Influence Effect
    • Familiarity Backfire Effect
    • Overkill Backfire Effect
    • Worldview Backfire Effect

This is the fourth of four posts pertaining to REASON – Counterpropaganda Principle #8: Logical Fallacies; Cognitive Biases; Continued Influence Effect of Misinformation; Debiasing Misinformation – Worldview and Backfire.

Other reports and items of interest:
The Nine Principles of Propaganda begins HERE.
Trump – Our Psychopathic President begins HERE.
For a double-sided PDF copy of the principles of propaganda and counterpropaganda go HERE.
For a double-sided PDF copy of the twelve criteria of psychopathy go HERE.


Cognitive Biases, Misinformation and the Brain

Cognitive biases and the peculiarities of brain structure that arose during human evolution affect us not only when we face well-designed propaganda, but in all aspects of our daily lives. The following letter to the editor was cited in our prior postings Reason #8b and #8c. We repeat it here because it gives a succinct and clear explanation of an issue of critical importance to this topic. It appeared in response to the Los Angeles Times article “Measles is deadly; vaccines are not” (2019 Feb. 10). [Emphasis added.]

Anti-vaccination mania stems from a brain wiring issue that plagues our species. It involves the interaction between the amygdala, which is designed to protect us, and the frontal cortex, which is designed to keep the amygdala honest.

When the amygdala is activated by threats real or imagined, it triggers the release of adrenaline which, in turn, hinders or even blocks access to the frontal cortex. The amygdala wants us to react, not think. The downside of this necessary mechanism is how quickly irrational threats (especially when powerfully and cleverly packaged) can go viral. With adrenaline pumping, groupthink and confirmation bias quickly can kick in, and presto, the irrational becomes self-evident (think climate change denial, slavery and countless other examples).

In the face of any threat, a vigilant frontal cortex is essential, especially when that perceived threat is counter to scientific consensus. It’s not that science is never wrong, but for people to reject it, the opposing evidence should be overwhelming, which is clearly not the case here.

A wise and wary frontal cortex knows all too well how wrong can feel so right.
Dale O’Neal, Clinical Psychologist, 16 February 2019

Whether misinformation is about vaccinations, climate science, occult beliefs, black helicopters, waves of criminals pouring over our borders, abductions by extraterrestrials or an Illuminati takeover, it’s all false. The structure of our brain causes us to seize on such scare stories as causes for anxiety and fight-or-flight readiness, and to persist in believing them despite later facts, corrections and retractions. Only a skeptical mind (aka a “vigilant frontal cortex”) can defeat the amygdala when it’s in full fear mode.

Corrections in the Face of Existing Belief Systems: Worldview and Skepticism [2]
The following is a condensation of an important metastudy, “Misinformation and its Correction: Continued Influence and Successful Debiasing.” Lewandowsky, Ecker, Seifert, Schwarz & Cook. (2012 Sep 17). Pgs 27-35 (Link to copy) [2]

The Importance of the Recipient’s Worldview

We more readily accept statements consistent with our current beliefs, so our “worldview” or “ideology” is a critical factor in the persistence of misinformation. For example, “birther” beliefs and WMD-related claims persist among Republicans despite retractions, but not among Democrats. Conservatives are better than liberals when judging risks from the consequences of higher oil prices, and are also better at recognizing the magnitude of risks from “peak oil.” Our pre-existing attitudes push us to keep believing worldview-consistent misinformation after retraction. Retractions of the misinformation that President Bush’s tax cuts in the early 2000s increased revenues, and that Iraq had WMDs, were effective only among Democrats; they backfired among Republicans, who became more committed to the misinformation. When people saw messages highlighting the adverse health effects of climate change, Democrats increased their support for climate-mitigation policies; the messages backfired among Republicans, whose support declined.

When solutions are “framed” to fit the worldview of the audience, they accept them more easily. Republicans find “carbon offset” charges more acceptable than a “tax;” Democrats don’t bridle at the word “tax” and thus accept either term.

Even public-health messages can polarize. When information that Type 2 diabetes can be caused by poor nutrition and junk food – common among the poor – is presented, Democratic support increases for ameliorating changes in public policy, but among Republicans it declines.  Republicans disbelieved rebuttals of the “death panel” myth; Democrats accepted them. Even product brands are affected. Those who love a particular brand suffer loss of self-esteem when they read negative information about the brand; those lacking emotional attachment to the brand remain unaffected.

Our pre-existing beliefs (“worldview”) affect the effectiveness of retractions of misinformation about real-world events. For example, supporters of the 2003 invasion of Iraq found it harder to accept the falsity of reports of WMDs in Iraq. The political-science literature confirms this view: the less one cares about a particular issue, the more one is influenced by factual or corrective information about it. Thus political issues per se do not necessarily lead to polarization.

Making things worse: Backfire effects

Society is damaged by misinformation when it concerns real-world issues such as climate change, tax policies, or decisions to go to war. In the real world, people tend to rely on misinformation that fits their worldview and are relatively immune to corrections. Retractions tend to backfire and strengthen pre-existing beliefs. In studies where people are given information that challenges their worldview, they “counter-argue” or remain unmovable (e.g., “We differ. I don’t believe them.”). In other studies, people exhibit “motivated skepticism”: they uncritically accept arguments for their own position but are highly skeptical of opposing arguments, actively counter-arguing to deride or invalidate the information. Such “boomerang” effects also appear when health messages are presented.

Belief polarization appears when the same information causes opposing views to diverge further. When religious believers and non-believers were exposed to a fictitious report disproving the Biblical account of the Resurrection, belief increased among believers and skepticism increased among non-believers. When presented with identical descriptions of nuclear power technological breakdowns, supporters focused on the safeguards that worked to prevent a worse accident, while opponents focused on the fact that the breakdown occurred in the first place. Techniques used to reduce belief polarization are very similar to techniques used to overcome worldview-related resistance to correction of misinformation.

Feelings of affiliation with a source are also a factor in information acceptance. Republicans accepted corrections of the “death panel” myth more readily when given by a Republican politician. Source credibility is also a function of belief: when you believe a statement, you judge its source as more credible. This cycle of resonance between belief and credibility can prevent opposing information from being judged sufficiently credible to overturn beloved beliefs, however false. One study showed that belief-threatening scientific evidence can lead to the discounting of the scientific method itself, and faith then trumps experiential facts. Even education can fail. Another study showed increasing education made it more likely that Democrats would view global warming as a threat, and less likely for Republicans. In another study, the more educated the Republican, the more likely they were to believe that President Obama was a Muslim (he is not). Few Democrats held this mistaken belief, and their level of education was not a factor.

We cannot yet completely rely on party affiliation or any other worldview measure in order to predict responses to misinformation correction. Neither do we completely understand the underlying cognitive processes. It may be that when we are heavily invested in our personal worldview, changing it to accommodate inconsistencies is too costly or too difficult. Our worldview may well function as an overall plan for processing related information, one which forces the rejection of uncomfortable new truths.

Taming worldview by affirming it

Studies show that debiasing messages and retractions must be tailored to fit the specific audience’s worldview: when solutions are “framed” to fit that worldview, they are accepted more easily. For example, “eco-centric” people, likely to dismiss nanotechnology as inherently unsafe, accept it more readily when it is presented as environmental protection. Climate-change deniers are less resistant when information is presented as a business opportunity for the nuclear industry. Even simple changes in wording increase acceptance when they make information less worldview-threatening; as noted above, Republicans find “carbon offset” charges more acceptable than a “tax,” while Democrats accept either term.

When messages are made “self-affirming” – when they include an opportunity for the audience to affirm their basic values as part of the corrective process – they are less worldview-threatening. When people were invited to write about a time when they felt especially good about themselves after acting on a value important to them, they became more receptive to potentially worldview-threatening messages. Encouraging self-affirmation seems to give the facts a “fighting chance.”

Such self-affirmation helped people accept negative information about a favorite brand product, apparently by letting them lower their brand esteem rather than their self-esteem. Helping people to face their worldview inconsistencies helps them to accept worldview corrections.

Skepticism—key to accuracy

Skepticism often reduces susceptibility to misinformation when people question the origin of information that may later turn out to be false. For example, people who initially questioned that finding and destroying WMDs was the reason for the 2003 invasion of Iraq were more accurate in processing later war news. Initial suspicion brought greater accuracy when processing later information, as well as greater accuracy in recognizing correct information, but did not cause “cynicism” – a blanket denial of all war-related information. Courtroom studies show that mock jurors continue to be influenced by evidence later ruled inadmissible, even when they claim they are not, unless they become suspicious of the motives of the prosecutor who introduced the evidence.

Research shows that such skepticism interweaves with trust. Trust is fundamental in society; while distrust is often corrosive, it can have a positive function. One study showed that people solve non-routine problems better after viewing a face rated by others as “untrustworthy.” In contrast, eliciting trust helps people perform better on routine (but not non-routine) problems. This suggests that distrust sensitizes people to their environment, awakening them to the non-routine.


Healthy skepticism or induced distrust seems to help us reject misinformation. These benefits apparently come from the non-routine, more “lateral” information processing that is primed when people are skeptical or distrustful. Skepticism at the time of message exposure helps more than skepticism which arises afterwards, and misinformation can prevail even when it or its source is later identified as intentionally misleading. In one study, people were presented with an attitude-changing report about a heroin-addicted child and the effectiveness of social youth-assistance programs. They then received a retraction, stating that the report was inaccurate because of either a mix-up (error condition) or because the author invented sensational ‘facts’ (deception condition). Retractions, especially concerning the deception, did bring participants to change their minds, but the effects of misinformation could not be erased in either condition. Misinformation’s effects on attitude lingered even after a retraction established the author had lied.

Using misinformation to inform

Brief interventions (e.g. “myth-vs.-fact”) do not work, but careful and prolonged dissections of incorrect arguments may.  One experiment compared a standard teaching lecture with an alternative which explicitly refuted 17 common misconceptions about psychology while leaving other misconceptions unchallenged. Explicit refutation was the more successful method. One review of the literature likewise argues for argumentation and rebuttal in science education and that classroom studies “…show improvements in conceptual learning when students engage in argumentation.”

Argumentation and engagement with an opponent may work even in the political arena. An analysis of over 40 opinion polls overthrows the conventional wisdom that winning a debate requires avoiding dialog and highlighting your own issues. Some studies suggest even explicit misinformation can be used as a teaching tool. One study had students read “denialist” literature to learn about climate science; by analyzing misinformation and developing the skills required to detect its flaws, they gained actual knowledge. In-depth discussion of misinformation and its correction can help students work through inconsistencies in their understanding and promote acceptance of corrections.

Debiasing in an Open Society
Continuing the condensation of “Misinformation and its Correction” Pgs 35-36 [2]

Information moves fast and far in modern society and false information quickly takes root among the unwary. Knowledge of the spread and persistence of misinformation and how to effectively counteract it is of great practical importance. Consider Rwanda, where a year-long large-scale field experiment took place in 2008-09. It found that a radio soap opera built around messages of reducing intergroup prejudice, violence, and trauma altered listeners’ perceptions of social norms and their behavior – albeit not beliefs – when compared to a control group which heard a health-focused soap opera. This confirmed that large-scale change can be achieved using conventional media.

Concise recommendations for the practitioner

  • Consider what gaps in people’s mental event models are created by your debunking and fill them with an alternative explanation.
  • Repeated retraction can reduce the influence of misinformation, although this also increases the risk of a backfire effect when the original misinformation is repeated and thereby rendered more familiar.
  • To avoid making people more familiar with misinformation (thus risking a familiarity backfire effect), emphasize the facts you wish to communicate rather than the myth.
  • Provide an explicit warning before mentioning the myth, to ensure people are cognitively on guard and less likely to be influenced by the misinformation.
  • Ensure your material is simple and brief. Use clear language and graphs where appropriate. If the myth is simpler and more compelling than your debunking, it will be cognitively more attractive and you risk an overkill backfire effect.
  • Consider whether your content may be threatening to the worldview and values of your audience. If so, you risk causing a worldview backfire effect, which is strongest among those with firmly held beliefs. This suggests that the most receptive people will be those who are not strongly fixed in their views.
  • If one must present evidence that may be threatening to the audience’s worldview, possible ways to reduce the worldview backfire effect are (a) present your content in a worldview-affirming manner (e.g., by focusing on opportunities and potential benefits rather than risks and threats) and/or (b) encourage self-affirmation.
  • The role of worldview can also be circumvented by focusing on behavioral techniques such as the design of choice architectures rather than overt debiasing. (A code sketch of the message-level recommendations follows this list.)
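To make the message-level recommendations concrete, here is a minimal sketch of a debunking template that follows them – fact first, an explicit warning, one brief mention of the myth, then an alternative explanation to fill the gap. The function and field names are our own invention, not anything prescribed by the paper:

```python
def build_debunking_message(fact: str, myth: str, alternative: str) -> str:
    """Assemble a correction following the checklist above: emphasize the
    fact, warn before repeating the myth, state the myth once and briefly,
    then fill the mental-model gap with an alternative explanation.
    Every part stays short to avoid the overkill backfire effect."""
    return "\n".join([
        f"FACT: {fact}",                    # lead with the fact, not the myth
        "Warning: a common myth follows.",  # explicit warning puts readers on guard
        f"MYTH: {myth}",                    # one brief mention keeps familiarity low
        f"WHY IT'S WRONG: {alternative}",   # alternative explanation fills the gap
    ])

print(build_debunking_message(
    fact="Vaccines are rigorously safety-tested and do not cause autism.",
    myth="The MMR vaccine causes autism.",
    alternative="The claim traces to a retracted, fraudulent 1998 study.",
))
```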

Three Problem Areas for Future Research
Continuing the condensation of “Misinformation and its Correction” Pgs 36-37 [2]

This survey of the literature enables us to provide a range of recommendations and draw some reasonably strong conclusions. However, this survey also identified a range of issues about which relatively little is known and which deserve future research attention. We wish to highlight three such issues.

The Role of Emotions is mixed. Emotionally stirring misinformation is not necessarily accepted more readily than emotionally neutral misinformation. Yet information likely to arouse emotion in others is passed on more often than truthful information, meaning that misinformation’s persistence may depend on its emotiveness. Also, information and retractions that challenge people’s worldviews make them emotionally defensive.

Concerning the role of Individual Differences such as race or culture, people’s responses to the same information differ depending on their personal worldviews or ideologies, but very little is known about other individual-differences variables. Intelligence, memory capacity, memory-updating ability, and ambiguity tolerance are just a few factors that could potentially mediate misinformation effects.

Concerning Social Networks, while “cyber-ghettos” have been studied, there is little understanding of the processes of misinformation dissemination through complex social networks and how these facilitate the persistence of misinformation in selected segments of society.

Psychosocial, Ethical, and Practical Implications
Continuing the condensation of “Misinformation and its Correction” Pgs 37-39 [2]

We conclude by discussing how misinformation effects can be reconciled with the notion of human rationality, before we address some limitations and ethical considerations surrounding debiasing, and point to an alternative behavioral approach to counteract the effects of misinformation.

The evidence shows that people can’t fully update memories with corrective information, that worldview overrides fact, and that corrections can backfire. It’s tempting, but premature, to conclude that people are “irrational” or cognitively “insufficient.” For example, when belief polarization was studied within a Bayesian network that captures the role of hidden psychological variables (e.g., during belief updating), it was found that behavior that initially appears “irrational” may actually represent a normal, rational integration of prior biases with new information.
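A toy model shows how this can work. In the sketch below – our own construction, not the model from the cited study – two observers read the same report asserting a claim H, but hold different hidden priors about whether the source reports truthfully or asserts the opposite of the truth. Applying Bayes’ rule to identical evidence then moves their beliefs in opposite directions:

```python
# Toy Bayesian model of belief polarization via a hidden "trust" variable.
# (Our construction for illustration, not the cited study's actual model.)
def posterior_after_report(prior_h: float, trust: float) -> float:
    """P(H | report asserting H), where trust = P(source reports truthfully).
    A truthful source asserts H only if H is true; a lying source asserts H
    only if H is false."""
    p_report_if_h = trust            # report matches a true H
    p_report_if_not_h = 1 - trust    # report inverts a false H
    evidence = p_report_if_h * prior_h + p_report_if_not_h * (1 - prior_h)
    return p_report_if_h * prior_h / evidence

# Both observers start undecided about H (prior 0.5) but differ on trust.
print(posterior_after_report(prior_h=0.5, trust=0.8))  # 0.8 -- belief rises
print(posterior_after_report(prior_h=0.5, trust=0.2))  # 0.2 -- belief falls
```

The same report pushes the two observers apart, yet each update is perfectly Bayesian given that observer’s prior about the source.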

Debiasing has ethical issues. We need a well-informed population, but debiasing techniques can also be used to further misinform people. Correcting misinformation is cognitively indistinguishable from misinforming people by disassembling their previously-held correct beliefs. The public must have a basic understanding of misinformation effects: Being aware of the fact that propagandists “throw mud” because they know it “sticks” is an important aspect of developing a healthy skepticism and a well-informed populace.

Debiasing can be inefficient. Backfire effects make debiasing most effective among people lacking strong beliefs about the misinformation. When misinformation agrees with a strong worldview, retractions can do more harm than good by further strengthening the misbelief. If debiasing can’t be framed in a worldview-congruent manner, don’t bother. Alternatively, you can ignore the misinformation altogether and seek more direct behavioral interventions. “Nudging” techniques can encourage certain decisions over others without preventing people from making a free choice. People will adopt low-emission behaviors when “nudged” by tax credits, even when already misinformed about climate science. Organ donation rates nearly double with the simple and transparent “nudge” of changing the default option from “opt-in” to “opt-out.”


“Nudging” is not tied to a specific delivery vehicle, which may not reach target audiences. Debiasing requires that the target audience actually receives the corrective information – difficult at best – but “nudging” such as described above “automatically” reaches everyone who is making the relevant choice. It is especially applicable in these situations: when an entire population must adapt quickly to prevent negative consequences (e.g. the Montreal Protocol to rapidly phase out CFCs to protect the ozone layer); when ideology is likely to prevent the success of debiasing; when there are organized efforts to deliberately misinform people (e.g. tobacco smoke, climate change).

Vested interests can persist for decades in dispensing misinformation, as the tobacco industry did long after the causal link between smoking and lung cancer was established. By claiming that after 1964 there was still “room for responsible disagreement” with the U.S. Surgeon General’s conclusion that tobacco was a major cause of death, they are arguably trying to replace one myth (“tobacco does not kill”) with another (“the tobacco industry did not know it”). The primary strategy of such vested interests is to spread doubt by exaggerating the uncertainty of scientific conclusions. Most people don’t understand the difference between scientific proof and syllogistic proof, and many treat a small uncertainty as if it were as meaningful as a large one. We need to understand the cognitive mechanisms of misinformation effects, but we also need to monitor these socio-political developments in order to better understand why certain misinformation can gather traction and persist in society.

Download or print the above PDF file HERE.

This is the eighth installment (part d) in our series on counterpropaganda.

Other reports and items of interest:
The Nine Principles of Propaganda begins HERE.
Trump – Our Psychopathic President begins HERE.
For a double-sided PDF copy of the principles of propaganda and counterpropaganda go HERE.
For a double-sided PDF copy of the twelve criteria of psychopathy go HERE.

THE NINE FUNDAMENTAL PRINCIPLES OF COUNTERPROPAGANDA
Propaganda is the backdoor hack into your mind


TRUTH – Honest opposition is practical, moral, and unbiased.
FOCUS – Address only one or at most two points.
CLARITY – Easily understood without further explanation.
RESONATE – Identify audience’s existing sentiments, opinions, and stereotypes that influence their perspectives, beliefs, and actions.
RESPOND – Lies not immediately refuted become the audience’s truth.
INVESTIGATE – Collect and analyze their propaganda to understand their message, target audience & objectives.
SOURCE – Expose covert sources of false propaganda.
REASON – Expose their logical fallacies. Human cognitive biases for rapid thought response make us vulnerable to faulty reasoning.

  •    REASON #8a – Logical Fallacies
  •    REASON #8b – Cognitive Biases
  •    REASON #8c – Continued Influence Effect of Misinformation
  •    REASON #8d – Debiasing Misinformation – Worldview and Backfire

DISSEMINATE – Share exposed propaganda with audiences not targeted; they can then recognize the lies and reciprocate.

Citations
1. Wikipedia – Counterpropaganda. Retrieved 2-4-19 from: https://en.wikipedia.org/wiki/Counterpropaganda#Expose_reasoning_errors

2. Lewandowsky, Stephan; Ecker, Ullrich K. H.; Seifert, Colleen M.; Schwarz, Norbert; Cook, John. (2012 Sep 17). “Misinformation and Its Correction: Continued Influence and Successful Debiasing.” Psychological Science in the Public Interest, Vol. 13, No. 3, 106–131. doi: 10.1177/1529100612451018. A metastudy. Pages 14-27.
Read paper on Academia.edu
SagePub.com abstract and paper
SagePub.com purchase portal
ResearchGate portal
Link to free copy of the entire paper and chart; retrieved 1-26-19 from: http://bocktherobber.com/wordpress/wp-content/uploads/2013/04/lewandowsky_et_al_misinformation_pspi_ip.pdf
Link to useful chart of “Concise Recommendation for the Practitioner”


Reason – Counterpropaganda Principle #8c

Expose their logical fallacies. Human cognitive biases for rapid thought response make us vulnerable to faulty reasoning.

  • Propagandists deliberately use errors in arguments to appeal to the emotions of their audience. [1]
  • Propagandists exploit cognitive biases and other elements of decision-making when shaping their messages to influence the target audience. [1]
  • Awareness of logical fallacies in our own reasoning helps people reject misinformation directed at them. [1]
  • Propaganda targets emotional reactions, not cognitive reasoning. [1]
  • Counterpropaganda must target emotions as well as reason. [1]
  • Four cognitive effects interfere with corrections to propaganda and misinformation: [2]
    • Continued Influence Effect
    • Familiarity Backfire Effect
    • Overkill Backfire Effect
    • Worldview Backfire Effect

This is the third of four posts pertaining to REASON – Counterpropaganda Principle #8: Logical Fallacies; Cognitive Biases; Continued Influence Effect of Misinformation; Debiasing Misinformation – Worldview and Backfire.

Other reports and items of interest:
The Nine Principles of Propaganda begins HERE.
Trump – Our Psychopathic President begins HERE.
For a double-sided PDF copy of the principles of propaganda and counterpropaganda go HERE.
For a double-sided PDF copy of the twelve criteria of psychopathy go HERE.

Cognitive Biases, Misinformation and the Brain

Cognitive biases and the peculiarities of our brain structure which appeared during the course of human evolution affect us not only when dealing with well-designed propaganda, but in all aspects of our daily lives. The following letter to the editor was cited in our prior posting Reason #8b – Cognitive Biases. We repeat it here because it gives a succinct and clear explanation of an issue of critical importance to this topic. It appeared in response to a Los Angeles Times article, “Measles is deadly; vaccines are not” (2019 Feb. 10). [Emphasis added.]

Anti-vaccination mania stems from a brain wiring issue that plagues our species. It involves the interaction between the amygdala, which is designed to protect us, and the frontal cortex, which is designed to keep the amygdala honest.

When the amygdala is activated by threats real or imagined, it triggers the release of adrenaline which, in turn, hinders or even blocks access to the frontal cortex. The amygdala wants us to react, not think. The downside of this necessary mechanism is how quickly irrational threats (especially when powerfully and cleverly packaged) can go viral. With adrenaline pumping, groupthink and confirmation bias quickly can kick in, and presto, the irrational becomes self-evident (think climate change denial, slavery and countless other examples).

In the face of any threat, a vigilant frontal cortex is essential, especially when that perceived threat is counter to scientific consensus. It’s not that science is never wrong, but for people to reject it, the opposing evidence should be overwhelming, which is clearly not the case here.

A wise and wary frontal cortex knows all too well how wrong can feel so right.
Dale O’Neal, Clinical Psychologist, 16 February, 2019

Whether misinformation is about vaccinations, climate science, occult beliefs, black helicopters, waves of criminals pouring over our borders, abductions by extra-terrestrials or an Illuminati takeover, it’s all false. It’s the structure of our brain which causes us to seize onto such scare stories as a cause for anxiety and fight-or-flight readiness, and to persist in believing them despite later facts, corrections and retractions. Only a skeptical mind (aka a “vigilant frontal cortex”) can defeat the amygdala when it’s in full fear mode.

Misinformation and its Correction [2]
The following is a condensation of an important metastudy, “Misinformation and its Correction: Continued Influence and Successful Debiasing.” Lewandowsky, Ecker, Seifert, Schwarz & Cook. (2012 Sep 17). Pgs 14-18 (Link to copy) [2]

Assessing the Truth of a Statement
Lies carry no warning labels. We usually cannot detect lies or errors until they are later retracted or corrected. In everyday conversation we normally default to the assumption that what we hear is true, clear and relevant. Research studies “…suggest that comprehension of a statement requires at least temporary acceptance of its truth before it can be checked against relevant evidence. On this view, belief is an inevitable consequence—or indeed precursor—to comprehension.” [Emphasis added] [2]

Is it compatible with what I believe?

Compatibility promotes acceptance. New information that fits with previously accepted information “feels right;” we accept it and file it with the rest. From then on, it becomes as highly resistant to change as the previously accepted information, because rejecting it would mean rejecting that earlier, now-resistant information along with it. Acceptance of a statement also increases when it is printed in high color contrast and an easy-to-read font, when it rhymes, and when it is spoken in a familiar accent.

Is the message coherent?

The message should fit well within a sensible, broader story. Studies on mental models and on jury decision-making have shown this to be especially true when the message cannot be assessed in isolation because it depends on other, related pieces of information. A compelling story organizes the available information, lacks internal contradictions, and is compatible with our common assumptions about human motivation and behavior. Good, coherent stories leave no gaps and are easily remembered. They are highly resistant to change because each element is supported by the others; altering one element alters the whole, making it less plausible. We understand and remember coherent stories more easily than incoherent ones, and this ease of remembering makes them more believable and more resistant to change.

Is the source credible?

A source’s credibility increases in importance as the listener’s motivation and understanding decline, and the more credible the source, the more we are persuaded. Yet non-credible sources can still influence us, because people are often insensitive to context: testimony under oath is believed no more than testimony not under oath, and studies funded by cigarette producer Corp. Z are as persuasive as studies from an independent commission with nothing to gain or lose. The gist of an interesting, coherent message is remembered long after its source is forgotten; good stories from untrustworthy sources will be remembered far longer than poor stories from credible sources. Mere repetition of a name, place or even an idea makes it more familiar, and hence more credible, even when it’s the same source repeating it over and over again. Even a message that was initially rejected may be accepted at a later time simply because it has become familiar.

Do others believe it?

Repetition increases acceptance. One 1945 study showed that repetition was the strongest predictor of belief in wartime rumors, possibly because it creates the illusion of social consensus. When others believe it, we feel it’s probably true, because we hear common, likely messages far more often than weird, unlikely ones. (We evolved to harbor that expectation.) Such familiarity usually indicates social consensus, our innate bias tells us. Even when information has become familiar for poor reasons, such as mere repetition of the same statement by the same source, the more often people hear or read it, the more they will believe it is widely accepted. Thus a single repetitive voice is treated as if it were a chorus. The “echo chamber” of social media networks creates such a chorus, and its repetition is especially influential, which explains the explosive proliferation of Russian bots, trolls and fake news.

This can lead to “pluralistic ignorance,” the divergence between how common a belief actually is and how common we think it is. For example, during the lead-up to the 2003 invasion of Iraq, the majority of Americans who wanted multilateral intervention believed themselves to be in the minority because of the unilateral interventionists’ dominance of the American media, while the actual minority of unilateral interventionists falsely believed they were the majority. A 2008 study showed that Australians with strongly negative attitudes towards Aboriginals or asylum-seekers over-estimated support for their attitudes by 67% and 80%, respectively. The 1.8% of people in the sample with strongly negative attitudes towards Aboriginal Australians thought that 69% of all Australians (and 79% of their friends) agreed with them. Such false social consensus can solidify and maintain belief in misinformation.

How best to correct such misinformation? Correcting faulty truth assessments involves a competition between the perceived truth value of misinformation and that of correct information. Ideally the correction will undermine the perceived truth of misinformation and enhance the acceptance of correct information. But such corrections often fail to work as expected due to the presence of four cognitive problems, beginning with the Continued Influence Effect.
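The “single repetitive voice treated as a chorus” can be made concrete with a toy sketch (our own construction; the speakers and message counts are invented for illustration). A listener who gauges consensus by how often a claim is heard, rather than by how many distinct people assert it, will badly overestimate support for a claim pushed by one repetitive source:

```python
from collections import Counter

# One bot repeats claim X thirty times; ten different people each state
# claim Y once. A listener counting messages "hears" X as the majority.
messages = [("bot_1", "claim X")] * 30 + \
           [(f"person_{i}", "claim Y") for i in range(10)]

by_message = Counter(claim for _, claim in messages)       # what gets heard
by_speaker = Counter(claim for _, claim in set(messages))  # who actually believes

for claim in ("claim X", "claim Y"):
    perceived = by_message[claim] / sum(by_message.values())
    actual = by_speaker[claim] / sum(by_speaker.values())
    print(f"{claim}: perceived consensus {perceived:.0%}, actual {actual:.0%}")

# claim X: perceived consensus 75%, actual 9%
# claim Y: perceived consensus 25%, actual 91%
```

Pluralistic ignorance in miniature: the claim with almost no real support is perceived as the majority view.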

The above sheet is available as a downloadable or printable PDF file HERE.

The Continued Influence Effect:
Retractions Fail to Eliminate the Influence of Misinformation
Continuing the condensation of “Misinformation and its Correction” Pgs 18-23 [2]

Numerous studies show that retractions are ignored and original misinformation is remembered even when readers have no motivation to believe either. A common test narrative involves a warehouse fire initially thought to have been caused by gas cylinders and oil paints negligently stored in a closet. Some readers then read a retraction: “The closet was actually empty.” Others see no retraction. When asked in subsequent testing “What caused the black smoke?” and “Did you see a retraction?”, among other questions, those who read the retraction continue to rely on the initial misinformation, even when they believed, understood and could later recall the retraction. At best, such reliance on misinformation was reduced by 50%; in many studies, the retraction had no effect whatsoever.

Retractions and corrections in the media have even less effect, whether they immediately follow the initial report or appear at a later date. In the studies, clarifications of the misinformation – “paint and gas were never on the premises” – backfired; people became even more likely to rely on the misinformation. While some additions to the correction helped – “a truckers’ strike prevented the expected delivery of the items” – continued reliance on the misinformation could still be detected. Numerous additional studies have yielded the same results.

One possible explanation has to do with the mental models we create of unfolding events. In the warehouse scenario, negligence led to improper storage of flammables, followed by an electrical fault igniting the materials. When the negligence and flammable-materials elements are retracted, a hole is left in the mental narrative, and the narrative no longer “makes sense” unless the false assertion is retained. When questioned, study participants continued to respond with the misinformation despite being aware of the correction. If they are asked to explain why the misinformation might be true, it becomes even more difficult to correct.

Studies show that people fill gaps in episodic memory with available inaccurate but “fitting” information. It may be that a complete but inaccurate model “feels” preferable to a model which is correct but incomplete. It “feels better” to stick to the original and coherent (but incorrect) narrative than to feel discomfort caused by a true-but-incoherent narrative.

Memory retrieval failure is another possible explanation. When valid and invalid memories compete for automatic activation, they might not be properly sorted. Questioning may activate misinformation when the misinformation supplies a plausible account of an event. The person would then need to actively think about which memories were true and which were false in order to sort them out, an effortful and uncomfortable activity.

Thirdly, there is some evidence that retraction processing is like attaching a “Negation Tag” to a memory entry (e.g., “there were oil paints and gas cylinders—NOT”). Such mental tags can be lost, leaving only the misinformation behind. If this is true, negations should be more successful when they affirm an alternate attribute: “Jim is tidy” works better than “Jim is not messy” when negating the original message “Jim is messy.” But “Jim is charismatic” has no alternative other than “not charismatic.” People will replace “messy” with “tidy” far more reliably than they replace “charismatic” with “not charismatic.”
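The “Negation Tag” idea can be pictured with a toy memory model (our own sketch; the tag-loss probability is an invented number, and nothing here comes from the paper’s methods). The claim is stored intact, the retraction is only a fragile tag attached to it, and whenever the tag fails to survive recall, the bare misinformation is what gets reported:

```python
import random
random.seed(1)  # reproducible illustration

# The memory trace stores the claim verbatim; the retraction is a NOT tag.
memory = {"claim": "oil paints and gas cylinders were in the closet",
          "negated": True}  # the retraction, stored only as a tag

def recall(entry, tag_loss=0.3):
    """Report the memory; the NOT tag survives a given recall only
    with probability 1 - tag_loss (tag_loss is an assumed number)."""
    tag_survives = entry["negated"] and random.random() > tag_loss
    return ("NOT: " if tag_survives else "") + entry["claim"]

reports = [recall(memory) for _ in range(10)]
kept = sum(r.startswith("NOT:") for r in reports)
print(f"{kept} of 10 recalls kept the retraction tag;",
      f"{10 - kept} reported the raw misinformation")
```

An affirmative replacement – storing “the closet was empty” instead of tagging the old claim – leaves no tag to lose, which is this model’s explanation for why “Jim is tidy” outperforms “Jim is not messy.”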

In another view, fluency (smooth processing of information during a later re-exposure) can cause misinformation – even when it is not recollected – to increase the perceived familiarity and coherence of a narrative. If so, retractions that repeat the misinformation can fail or backfire because the misinformation already “feels familiar.” Reading “no paints and gas were present” reinforces the familiar misinformation that “paints and gas were present”; the retraction becomes one more repetition of the misinformation and makes the memory even stronger. “I heard that before, so there’s probably something to it.”

Such fluency-based effects make the correction of misinformation difficult. “Myth vs. Fact” approaches are especially ineffective. Immediately after reading a hand-out from the U.S. Centers for Disease Control and Prevention, readers correctly identified the myths and the facts. Only 30 minutes later, they identified more myths as facts than people who had never read the handout at all, and they based their future plans (to refuse inoculations) on their erroneous recollections. Older adults and children are most susceptible to such fluency-based backfire effects; they are also the most likely to accept explicit messages that the information is false.

When corporations pretend to be associated with events such as the Olympic Games – a practice known as “ambushing” – not only is the ambush successful, but attempts to expose it with counter-ambushing usually backfire and lead people to believe in the association even more.

Social Reactance can cause retractions to be ineffective. Many people don’t like being told what to think and how to act, and they may reject authoritative retractions. Numerous studies have presented mock jurors with evidence later ruled inadmissible. When jurors are asked to disregard the tainted evidence, they show higher conviction rates when the “inadmissible” ruling is accompanied by a judge’s extensive legal explanations than when the inadmissibility is left unexplained.

The above sheet is available as a downloadable or printable PDF file HERE.

Reducing Misinformation’s Impact
Continuing the condensation of “Misinformation and its Correction” Pgs 23-27 [2]

Pre-exposure warnings

Explicit up-front warnings that people are about to encounter misleading information can significantly reduce the influence of misinformation. One study found that such warnings must specifically explain the continuing influence of misinformation, not just mention its presence. This can be applied in advertising, in pseudoscientific or historical fiction, and especially in court, where jurors often hear information they are later instructed to disregard. Because our default behavior is to assume that presented information is valid, warnings must come before exposure so that recipients can “tag” the information as suspect – afterwards is too late. Early warnings may work by inducing a temporary state of skepticism, which increases our ability to discriminate between true and false information.

Repeated Retractions

When misinformation is repeated, repetition of retraction can alleviate, but not eliminate, misinformation’s effects. However, the effect of a single presentation of misinformation persisted just as strongly after three retraction repetitions as it did after a single retraction. Misinformation effects are extremely hard to eliminate or drive below a base level of “irreducible persistence,” however strong the retraction(s). Several explanations for this phenomenon have been offered.

(1). Initial misinformation may be processed into memory automatically, whereas changing or eliminating a memory requires conscious activity; the person must therefore be aware of the misinformation’s automatic effect on their reasoning.

(2). When a single presentation of misinformation is “tagged – NOT” by a retraction, additional retractions don’t increase the strength of the tag. But because misinformation effects are strengthened by repetition, repeating the retraction helps when the misinformation itself has been repeated, by increasing the “tag – NOT” strength (a toy sketch of this pattern follows below).

(3). A “methinks they protest too much” effect may result from repetitions of corrections.

(4). When misinformation is repeated within the retraction, it can backfire and increase acceptance of the misinformation.

Repetition of misinformation has a stronger and more reliable negative effect than the positive effect of repetition of retractions. This is especially unfortunate for social networks, where lies and propaganda quickly spread but corrections do not.
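The pattern described in points (1) and (2), together with the “irreducible persistence” floor, can be summarized in a back-of-the-envelope model (our own sketch; every coefficient, including the floor, is an assumed number chosen only to reproduce the qualitative findings above). Repetition strengthens the trace, retraction tags saturate at one per exposure, and net reliance never reaches zero:

```python
FLOOR = 0.2  # assumed "irreducible persistence" of misinformation

def reliance(misinfo_reps, retraction_reps):
    """Net reliance on misinformation after m exposures and r retractions."""
    strength = 0.5 + 0.5 * misinfo_reps             # repetition strengthens
    tag = 0.5 * min(retraction_reps, misinfo_reps)  # extra retractions saturate
    return max(strength - tag, FLOOR)

for m, r in [(1, 1), (1, 3), (3, 1), (3, 3)]:
    print(f"{m} exposure(s), {r} retraction(s): reliance = {reliance(m, r):.2f}")

# 1 exposure: three retractions do no better than one (0.50 either way);
# 3 exposures: repeated retractions help (1.50 -> 0.50) but never reach zero.
```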

Filling the Gap – Providing an Alternative Account

In studies using the “gas, oil paints and warehouse fire” scenario or its equivalents, retractions such as “there was no negligent storage of gas and oil paint in a closet” did not work, probably because they left holes in the narrative. Alternative explanations – “arson materials were found” – did work, probably by filling the narrative gap. Such alternative explanations must be plausible, account for the initial narrative’s main points and, ideally, explain why the initial misinformation was thought correct. Explanations of the motivation behind the incorrect report are particularly successful: “Someone overheard someone guessing at the cause and mistook it for fact.” Merely mentioning an alternative scenario will not reduce reliance on misinformation; it has to be solidly integrated into the explanation. Simple explanations are best; people will reject a correct but complex explanation and cling to a wrong but simple one. Providing too many counter-arguments can backfire through “overkill.”

In political misinformation and corrections, where lies often abound and everyone’s motives are suspect, people judge corrections by their source and place more suspicion on explanations from sources they distrust.

In conclusion, there are three established techniques to reduce the continued influence of misinformation: Pre-exposure Warnings, Repeated Retractions, and – most effective – Alternative Explanations that fill the narrative gap. Unfortunately it may take time to determine the correct explanation, and time is often critically short.

The above sheet is available as a downloadable or printable PDF file HERE.

This is the eighth installment (part c) in our series on counterpropaganda.

Other reports and items of interest:
The Nine Principles of Propaganda begins HERE.
Trump – Our Psychopathic President begins HERE.
For a double-sided PDF copy of the principles of propaganda and counterpropaganda go HERE.
For a double-sided PDF copy of the twelve criteria of psychopathy go HERE.

THE NINE FUNDAMENTAL PRINCIPLES OF COUNTERPROPAGANDA
Propaganda is the backdoor hack into your mind


TRUTH – Honest opposition is practical, moral, and unbiased.
FOCUS – Address only one or at most two points.
CLARITY – Easily understood without further explanation.
RESONATE – Identify audience’s existing sentiments, opinions, and stereotypes that influence their perspectives, beliefs, and actions.
RESPOND – Lies not immediately refuted become the audience’s truth.
INVESTIGATE – Collect and analyze their propaganda to understand their message, target audience & objectives.
SOURCE – Expose covert sources of false propaganda.
REASON – Expose their logical fallacies. Human cognitive biases for rapid thought response make us vulnerable to faulty reasoning.

  •    REASON #8a – Logical Fallacies
  •    REASON #8b – Cognitive Biases
  •    REASON #8c – Continued Influence Effect of Misinformation
  •    REASON #8d – Debiasing Misinformation – Worldview and Backfire

DISSEMINATE – Share exposed propaganda with audiences not targeted; they can then recognize the lies and reciprocate.

Citations
1. Wikipedia – Counterpropaganda. Retrieved 2-4-19 from: https://en.wikipedia.org/wiki/Counterpropaganda#Expose_reasoning_errors

2. Lewandowsky, Stephan; Ecker, Ullrich K. H.; Seifert, Colleen M.; Schwarz, Norbert; Cook, John. (2012 Sep 17). “Misinformation and Its Correction: Continued Influence and Successful Debiasing.” Psychological Science in the Public Interest, Vol. 13, No. 3, 106–131. doi: 10.1177/1529100612451018. A metastudy. Pages 14-27.
Read paper on Academia.edu
SagePub.com abstract and paper
SagePub.com purchase portal
ResearchGate portal
Link to free copy of the entire paper and chart; retrieved 1-26-19 from: http://bocktherobber.com/wordpress/wp-content/uploads/2013/04/lewandowsky_et_al_misinformation_pspi_ip.pdf
Link to useful chart of “Concise Recommendation for the Practitioner”


Reason – Counterpropaganda Principle #8b

Expose their logical fallacies. Human cognitive biases for rapid thought response make us vulnerable to faulty reasoning.

  • Propagandists deliberately use errors in arguments to appeal to the emotions of their audience. [1]
  • Propagandists exploit cognitive biases and other elements of decision-making when shaping their messages to influence the target audience. [1]
  • Awareness of logical fallacies in our own reasoning helps people reject misinformation directed at them. [1]
  • Propaganda targets emotional reactions, not cognitive reasoning. [1]
  • Counterpropaganda must target emotions as well as reason. [1]
  • Four cognitive effects interfere with corrections to propaganda and misinformation: [2]
    • Continued Influence Effect
    • Familiarity Backfire Effect
    • Overkill Backfire Effect
    • Worldview Backfire Effect

This is the second of four posts pertaining to REASON – Counterpropaganda Principle #8: Logical Fallacies; Cognitive Biases; Continued Influence Effect of Misinformation; Debiasing Misinformation – Worldview and Backfire.

Other reports and items of interest:
The Nine Principles of Propaganda begins HERE.
Trump – Our Psychopathic President begins HERE.
For a double-sided PDF copy of the principles of propaganda and counterpropaganda go HERE.
For a double-sided PDF copy of the twelve criteria of psychopathy go HERE.

Cognitive Biases, Misinformation and the Brain

Cognitive biases and the peculiarities of our brain structure which appeared during the course of human evolution affect us not only when dealing with well-designed propaganda, but in all aspects of our daily lives. The following letter to the editor gives a succinct and clear explanation. It appeared in response to a Los Angeles Times article, “Measles is deadly; vaccines are not” (2019 Feb. 10). [Emphasis added.]

Anti-vaccination mania stems from a brain wiring issue that plagues our species. It involves the interaction between the amygdala, which is designed to protect us, and the frontal cortex, which is designed to keep the amygdala honest.

When the amygdala is activated by threats real or imagined, it triggers the release of adrenaline which, in turn, hinders or even blocks access to the frontal cortex. The amygdala wants us to react, not think. The downside of this necessary mechanism is how quickly irrational threats (especially when powerfully and cleverly packaged) can go viral. With adrenaline pumping, groupthink and confirmation bias quickly can kick in, and presto, the irrational becomes self-evident (think climate change denial, slavery and countless other examples).

In the face of any threat, a vigilant frontal cortex is essential, especially when that perceived threat is counter to scientific consensus. It’s not that science is never wrong, but for people to reject it, the opposing evidence should be overwhelming, which is clearly not the case here.

A wise and wary frontal cortex knows all too well how wrong can feel so right.
Dale O’Neal, Clinical Psychologist, 16 February, 2019

Whether misinformation is about vaccinations, climate science, occult beliefs, black helicopters, waves of criminals pouring over our borders, abductions by extra-terrestrials or an Illuminati takeover, it’s all false. It’s the structure of our brain which causes us to seize onto such scare stories as a cause for anxiety and fight-or-flight readiness, and to persist in believing them despite later facts, corrections and retractions. Only a skeptical mind (aka a “vigilant frontal cortex”) can defeat the amygdala when it’s in full fear mode.

Counterpropaganda and the Difficulty of Exposing Cognitive Reasoning Errors [3]
Propagandists exploit our human cognitive biases and logical fallacies in order to influence us. They know our weaknesses, shape their message to be credible and emotionally attractive, and slip their propaganda through these holes in our defenses. Counterpropagandists try to negate propaganda messages through exposing and resolving the target audience’s errors in judgment. This method works similarly to SOURCE (Counterpropaganda Principle #7) which, by revealing the true origin of a propaganda message, reduces the credibility of its broadcaster by exposing them as an enemy, dupe or liar. The hope is that the target audience, when made aware of their own logical fallacies, will then reject messages based on this faulty reasoning.

Unfortunately, recent studies strongly suggest that the effectiveness of propaganda rests not on cognitive reasoning errors but on emotional reactions; propaganda works best when it focuses on our emotions (see Propaganda Principles #4 Blame, #5 Provoke, #6 Crisis, #7 Emotional Symbols, and #8 Pander). Therefore, exposing a group’s logical errors is not as effective in refuting propaganda messages as was originally believed.

Jacques Ellul [3][4] argues that the speed with which events occur, become outdated and lose interest leaves people too inattentive and unaware to think seriously about current events. The more superficial we become, the more effective propaganda becomes. We are simply too superficial to care about our own cognitive biases and logical fallacies. Yet Ellul stresses that this element of counterpropaganda remains critical, because it exposes how our own mental vulnerabilities leave us open to propaganda.

Cognitive Biases [5]

Cognitive biases are systematic patterns of deviation from rationality in judgment. Although we all share the same physical world, we each interpret our sensory input from this world to construct our own “subjective social reality.” It is this interpretive construction, not the objective input from the world, which often dictates our social behavior in the world. Thus innate cognitive biases can lead to perceptual distortions, inaccurate judgments, illogical interpretations, or what is often called irrationality. Our cognitive biases are adaptive and beneficial when they lead to more effective decisions and actions within a given context, particularly when rapidity outweighs accuracy, when information is unavailable, or when information proliferates beyond comprehension. These biases appeared – and then persisted – during the course of human evolution because their benefits outweighed their detriments. But as human population grows and social relationships become ever more numerous and complex, the balance of benefits may be shifting towards the negative. We now need to know that these biases exist, how they work, and how they can be neutralized when necessary.

Here are thirty cognitive biases which pertain to propaganda and counterpropaganda. [6]

Affinity Bias: The tendency to associate with people like ourselves. Similarity can be based on race, religion, politics, age, sex, nationality, education, wealth, and so on.

Anchoring (or Focalism) Bias: The tendency to rely too heavily, or “anchor”, on one trait or piece of information when making decisions (usually the first piece of information acquired on that subject).

Availability Cascade: A self-reinforcing process in which a collective belief gains more and more plausibility through increasing repetition in public discourse (“repeat something long enough and it will become true”). (See Propaganda Principle #2 – REPEAT.)

Availability Heuristic: The tendency to overestimate the likelihood of events with greater “availability” in memory, which can be influenced by how recent the memories are or how unusual or emotionally charged they may be.

Backfire (or Continued Influence) Effect: The reaction to retractions, correction and disconfirming evidence by strengthening one’s previous reliance on misinformation.

Bandwagon Effect: The tendency to do (or believe) things because many other people do (or believe) the same. Related to groupthink and herd behavior.

Belief Bias: Evaluation of the logical strength of an argument is biased by your belief in the believability of the conclusion.

Confirmation Bias: The tendency to search for, interpret, focus on and remember information in a way that confirms your preconceptions. By re-confirming your views, you reduce inconsistencies in both information and belief, thereby minimizing uncomfortable feelings of cognitive dissonance.

Correspondence Bias: In contrast to interpretations of our own behavior, we tend to unduly emphasize the presumed internal characteristics (character or intentions), rather than external factors, when explaining the behavior of others. This effect has been described as “the tendency to believe that what people do reflects who they are.”

Declinism: The predisposition to view the past favorably (“rosy retrospection”) and future negatively.

Dunning-Kruger Effect: The tendency for unskilled individuals to overestimate their own ability and the tendency for experts to underestimate their own ability.

Empathy Gap: The tendency to underestimate the influence or strength of feelings, in either oneself or others.

Framing Bias: Using a too-narrow approach and description of a situation or issue. It is a setting which causes people to react based on the way the brain makes comparisons: loss vs. gain, inexpensive vs. expensive, better vs. worse.

Hindsight Bias: The “I-knew-it-all-along” effect; the tendency to see past events as being predictable at the time those events happened.

Illusion of Control: The tendency to overestimate one’s degree of influence over other external events.

Illusion of Validity: Belief that our judgments are accurate, especially when available information is consistent or inter-correlated.

Illusory Truth Effect: A tendency to believe that a statement is true if it is easier to process, or if it has been stated multiple times, regardless of its actual truth. These are specific cases of truthiness.

Irrational Escalation: The phenomenon where people justify increased investment in a decision, based on the cumulative prior investment, despite new evidence suggesting that the decision was probably wrong. Also known as the sunk cost fallacy.

Overconfidence Effect: Excessive confidence in one’s own answers to questions. For example, for certain types of questions, answers that people rate as “99% certain” turn out to be wrong 40% of the time.

Priming Bias: The tendency to be influenced by a preconceived idea caused by what someone else has said. It is a setting which causes people to react based on how the brain groups information. You are more likely to recognize the word ‘n_rse’ as ‘nurse’ when it follows ‘doctor,’ but as ‘Norse’ when it follows ‘Viking.’

Reactance: The urge to do the opposite of what someone wants you to do out of a need to resist a perceived attempt to constrain your freedom of choice. Related to ‘reverse psychology.’

Reactive Devaluation: Devaluing proposals only because they purportedly originated with an adversary.

Rhyme as Reason Effect: Rhyming statements are perceived as more truthful. A famous example is the defense’s use of the phrase “If it doesn’t fit, you must acquit” in the O.J. Simpson trial.

Selective Perception: The tendency for expectations to affect perception.

Semmelweis Reflex: The tendency to reject new evidence that contradicts a paradigm you use.

Self-Serving Bias: The tendency to more often claim responsibility for successes than for failures. The tendency to evaluate ambiguous information in a way which benefits your own interests.

Social Comparison Bias: The tendency, when making decisions, to favor potential candidates who don’t compete with one’s own particular strengths.

Stereotyping: Expecting a member of a group to have certain characteristics without having actual information about that individual.

Third-Person Effect: Belief that mass communicated media messages have a greater effect on others than on themselves.

Zero-Risk Bias: Preference for reducing a small risk to zero over a greater reduction in a larger risk.

Cognitive biases such as these enable us to quickly process information and to have greater understanding about what the rest of our group is thinking and feeling. In fight-or-flight situations for our ancestors, rapid decision making could easily make the difference between survival and death. Similarly, their group (family, clan, tribe, etc.) was essential to their individual survival, and they’d better know how to survive within the group and cooperate with the rest of the members. The cognitive biases evolved for good reasons, and – like it or not – we’re stuck with them. If we don’t become aware of our innate biases and learn to recognize and deal with them, the propagandists among us will continue to use them against us. Your own biases are either your tools, or theirs.

There are many cognitive biases. Your Bias Is lists 24 in a very user-friendly format: anchoring, availability, backfire, Barnum effect, belief, bystander, confirmation, curse of knowledge, declinism, Dunning-Kruger effect, framing, fundamental attribution error, groupthink, halo, in-group, just-world hypothesis, negativity, optimism, pessimism, placebo, reactance, self-serving, spotlight, and sunk cost fallacy.

Wikipedia’s List of Cognitive Biases contains 190 items: 113 decision-making, belief & behavioral biases, 28 social biases, and 49 memory errors and biases. Don’t be biased against looking them up.

This is the eighth installment (part b) in our series on counterpropaganda.

Other reports and items of interest:
The Nine Principles of Propaganda begins HERE.
Trump – Our Psychopathic President begins HERE.
For a double-sided PDF copy of the principles of propaganda and counterpropaganda go HERE.
For a double-sided PDF copy of the twelve criteria of psychopathy go HERE.

THE NINE FUNDAMENTAL PRINCIPLES OF COUNTERPROPAGANDA
Propaganda is the backdoor hack into your mind


TRUTH – Honest opposition is practical, moral, and unbiased.
FOCUS – Address only one or at most two points.
CLARITY – Easily understood without further explanation.
RESONATE – Identify audience’s existing sentiments, opinions, and stereotypes that influence their perspectives, beliefs, and actions.
RESPOND – Lies not immediately refuted become the audience’s truth.
INVESTIGATE – Collect and analyze their propaganda to understand their message, target audience & objectives.
SOURCE – Expose covert sources of false propaganda.
REASON – Expose their logical fallacies. Human cognitive biases for rapid thought response make us vulnerable to faulty reasoning.

  •    REASON #8a – Logical Fallacies
  •    REASON #8b – Cognitive Biases
  •    REASON #8c – Continued Influence Effect of Misinformation
  •    REASON #8d – Debiasing Misinformation – Worldview and Backfire

DISSEMINATE – Share exposed propaganda with audiences not targeted; they can then recognize the lies and reciprocate.

Citations
1. Wikipedia – Counterpropaganda. Retrieved 2-4-19 from: https://en.wikipedia.org/wiki/Counterpropaganda#Expose_reasoning_errors

2. Lewandowsky, Stephan; Ecker, Ullrich K. H.; Seifert, Colleen M.; Schwarz, Norbert; Cook, John. (2012 Sep 17). “Misinformation and Its Correction: Continued Influence and Successful Debiasing.” Psychological Science in the Public Interest, Vol. 13, No. 3, 106–131. doi: 10.1177/1529100612451018. A metastudy.
Read paper on Academia.edu
SagePub.com abstract and paper
SagePub.com purchase portal
ResearchGate portal
Link to free copy of the entire paper and chart; retrieved 1-26-19 from: http://bocktherobber.com/wordpress/wp-content/uploads/2013/04/lewandowsky_et_al_misinformation_pspi_ip.pdf
Link to useful chart of “Concise Recommendation for the Practitioner”

3. Adapted from Wikipedia – Counterpropaganda. Retrieved 2-4-19 from: https://en.wikipedia.org/wiki/Counterpropaganda#Expose_reasoning_errors

4. Ellul, Jacques (1973). Propaganda: The Formation of Men’s Attitudes (Reprinted ed.). New York: Vintage Books. ISBN 978-0-394-71874-3. Cited by Wikipedia – Counterpropaganda. Retrieved 2-4-19 from: https://en.wikipedia.org/wiki/Counterpropaganda#Expose_reasoning_errors

5. Adapted from Wikipedia – Cognitive Bias. Retrieved 2-4-19 from: https://en.wikipedia.org/wiki/Cognitive_bias

6. Wikipedia’s List of Cognitive Biases contains 190 items: 113 decision-making, belief & behavioral biases, 28 social biases, and 49 memory errors and biases. Fear not; look them up. Retrieved and adapted 2-4-19 from: https://en.wikipedia.org/wiki/List_of_cognitive_biases

Additional Reading
What is the difference between framing and priming effects? Gilman, Jeff. (3-26-16). Quora.com
https://www.quora.com/What-is-the-difference-between-framing-and-priming-effect

Reason – Counterpropaganda Principle #8a

Expose their logical fallacies. Human cognitive biases for rapid thought response make us vulnerable to faulty reasoning.

  • Propagandists deliberately use errors in arguments to appeal to the emotions of their audience. [1]
  • Propagandists exploit cognitive biases and other elements of decision-making when shaping their messages to influence the target audience. [1]
  • Awareness of logical fallacies in our own reasoning helps people reject misinformation directed at them. [1]
  • Propaganda targets emotional reactions, not cognitive reasoning. [1]
  • Counterpropaganda must target emotions as well as reason. [1]
  • Four cognitive effects interfere with corrections to propaganda and misinformation: [2]
    • Continued Influence Effect
    • Familiarity Backfire Effect
    • Overkill Backfire Effect
    • Worldview Backfire Effect

This is the first of four posts pertaining to REASON – Counterpropaganda Principle #8: Logical Fallacies, Cognitive Biases, Continued Influence Effect of Misinformation, Debiasing Misinformation – Worldview and Backfire.

Propagandists Twist Logic

Centuries ago, logic was considered essential to the well-educated, and reasoning and debate were sports. Of the 256 possible syllogistic structures, only fifteen are valid, and these were so familiar to the educated that they bore names such as Cesare, Datisi and Bocardo. Those days are long gone. Most modern people cannot tell valid arguments from invalid ones. For example:

Argument 1
Premise 1: All Greeks are mortal.
Premise 2: Robert is a Greek.
Conclusion: Robert is mortal.

Argument 2
Premise 1: Some people are bad.
Premise 2: All illegals are people.
Conclusion: All illegals are bad.

The validity of argument 1 is straightforward; you can substitute fruit and it still works: if all fruit tastes good, and all apples are fruit, then all apples taste good. That argument 2 is invalid is not so obvious, so we’ll again use fruit. All apples are fruit, and some fruit tastes bad, but you cannot validly conclude that all apples taste bad. You can’t even validly say that some or any apples necessarily taste bad. They might, they might not; you don’t know. Argument 2 may sound valid, especially when it comes from a fast-talking liar, but it falls apart under examination.
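Validity here means that no possible situation makes all the premises true and the conclusion false, and that is mechanically checkable. Below is a small brute-force sketch (our own illustration; the predicate names and the three-individual cap are assumptions). It finds no countermodel for argument 1, but immediately finds one for argument 2:

```python
from itertools import product

def find_countermodel(predicates, premises, conclusion, max_n=3):
    """Search small 'worlds' (lists of individuals with true/false
    predicates) for one where every premise holds but the conclusion
    fails. None means no countermodel up to max_n individuals."""
    flag_rows = list(product([False, True], repeat=len(predicates)))
    for n in range(1, max_n + 1):
        for assignment in product(flag_rows, repeat=n):
            world = [dict(zip(predicates, row)) for row in assignment]
            if all(p(world) for p in premises) and not conclusion(world):
                return world
    return None

# Argument 1: All Greeks are mortal; Robert is a Greek; so Robert is mortal.
arg1 = find_countermodel(
    ("greek", "mortal", "robert"),
    premises=[
        lambda w: all(x["mortal"] for x in w if x["greek"]),
        lambda w: all(x["greek"] for x in w if x["robert"]),
        lambda w: any(x["robert"] for x in w),  # Robert exists
    ],
    conclusion=lambda w: all(x["mortal"] for x in w if x["robert"]),
)
print("Argument 1 countermodel:", arg1)  # None -> valid

# Argument 2: Some people are bad; all illegals are people;
# therefore all illegals are bad.
arg2 = find_countermodel(
    ("person", "illegal", "bad"),
    premises=[
        lambda w: any(x["person"] and x["bad"] for x in w),
        lambda w: all(x["person"] for x in w if x["illegal"]),
    ],
    conclusion=lambda w: all(x["bad"] for x in w if x["illegal"]),
)
print("Argument 2 countermodel:", arg2)  # a small world refuting it
```

The countermodel for argument 2 is the fruit story in miniature: one bad person who isn’t an “illegal,” plus one “illegal” who isn’t bad, satisfies both premises while falsifying the conclusion.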

Common Logical Fallacies [3]

Because we are fundamentally ignorant of logic and the common logical fallacies, propagandists often make false and misleading statements which we never notice. There are well over one hundred known logical fallacies (Master List of 143 Logical Fallacies ). We have prepared a PDF list of the Forty Most Common Logical Fallacies. Many of these have been known for centuries and carry Latin names. Here are a few:

APPEAL TO EMOTION Argumentum ad misericordiam (Informal Fallacy > Red Herring): You attempt to manipulate an emotional response in place of a valid or compelling argument. Appeals to emotion include appeals to fear, envy, hatred, pity, pride, and more. It’s important to note that sometimes a logically coherent argument may inspire emotion or have an emotional aspect, but the problem and fallacy occurs when emotion is used instead of a logical argument, or to obscure the fact that no compelling rational reason exists for one’s position. Everyone, bar psychopaths, is affected by emotion, and so appeals to emotion are a very common and effective argument tactic, but they’re ultimately flawed, dishonest, and tend to make one’s opponents justifiably emotional.
Example: Luke wouldn’t eat raw sheep’s brains with Brussels sprouts, but his father told him to think about the poor, starving children in a third-world country who had no food at all.

ATTACK THE PERSON ad hominem  “To the man” (Informal Fallacy > Red Herring > Genetic): You attack your opponent’s character or personal traits in an attempt to undermine their argument. Ad hominem attacks can take the form of overtly attacking somebody, or more subtly casting doubt on their character or personal attributes as a way to discredit their argument. The result of an ad hominem attack can be to undermine someone’s case without actually having to engage with it.
Example: “Little Marco Rubio!” “Don’t believe what she says because she smells weird and isn’t married.”

BEGGING THE QUESTION petitio principii (Informal Fallacy): You present a circular argument in which the conclusion was included in the premise. This logically incoherent argument often arises in situations where people have an assumption that is very ingrained, and therefore taken in their minds as a given. Circular reasoning is bad mostly because it’s not very good.
Example: The word of Zorbo the Great is flawless and perfect. We know this because it says so in The Great Infallible Book of Zorbo’s Best and Truest Things that are Definitely True and Should Never Be Questioned.

CHERRY-PICKING THE DATA (Card-Stacking, Controlling the Microphone, Ignoring Counterevidence, One-sided Assessment, Quoting out of Context, Proof-Texting, Slanting, Suppressed Evidence)(Informal Fallacy > One-Sidedness > Quoting Out of Context): Selecting only data which suits your argument, or finding a pattern which fits your presumption, while ignoring contradictory data or patterns. When you know your goal or the numbers you have, it’s easy to concoct agreeable explanations for them.
Example: In his review of cigarettes and lung cancer studies, tobacco company employee John left out all studies that supported the cigarette-cancer link and used only studies that found no link.

COMPOSITION/DIVISION (Informal Fallacy): You assume that what is true of one part of something must apply to all other parts, or to the whole; or that what is true of the whole must apply to its parts. Often what is true of the part does also apply to the whole, or vice versa, but the crucial difference is whether good evidence shows this to be the case. Because we observe consistencies in things, our thinking can become biased so that we presume consistency to exist where it does not.
Example: Daniel, a precocious child, liked logic. He reasoned that because atoms are invisible, and because he was made of atoms, he was therefore invisible too. Unfortunately he still lost at hide-and-go-seek.
Example: “These three men are convicted rapists, so all men are rapists.”

EQUIVOCATION (Informal Fallacy > Ambiguity): Equivocation is a type of ambiguity in which a single word or phrase has two or more distinct meanings, which contrasts with amphiboly, which is grammatical ambiguity. The reason that it qualifies as a fallacy is that it is intrinsically misleading.
Example: The defendant told the judge that he shouldn’t have to pay the parking fine because the sign said ‘Fine for parking here’ and so he naturally presumed that it would be fine to park there.
Example: “Evolution is just a theory!” In science, “theory” means a coherent group of propositions used as principles of explanation for a class of phenomena; a theory is an overarching framework which provides an explanation for observations and a guide for hypothesis-testing. Colloquially, it can mean any wild idea that pops into your head. “I have this theory about why space aliens built the pyramids.” In both cases, the speaker is twisting the original meaning of the word.

NO TRUE SCOTSMAN (Informal Fallacy > Ambiguity): You make what could be called an appeal to purity as a way to dismiss relevant criticisms of flaws in your argument.
In this form of faulty reasoning one’s belief is rendered unfalsifiable because no matter how compelling the evidence is, one simply shifts the goalposts so that it wouldn’t apply to a supposedly ‘true’ example. This kind of post-rationalization is a way of avoiding valid criticisms of one’s argument.
Example: Angus declares that Scotsmen do not sugar their porridge, to which Lachlan points out that he is a Scotsman and sugars his porridge. Furious, Angus shouts that no true Scotsman puts sugar on his porridge.
Examples:
“Real men don’t eat quiche.” “No honest, patriotic American would even think to question the President’s decision.”

PERSONAL INCONSISTENCY tu quoque “And you too!” or “What about you?” (Informal Fallacy > Red Herring> Two Wrongs Make a Right): You avoid engaging with criticism by turning it back on the accuser – you answered criticism with criticism.
Tu quoque is also known as the appeal to hypocrisy. It is commonly employed as an effective red herring because it takes the heat off someone having to defend their argument, and instead shifts the focus back on to the person making the criticism. “I’m not a crook but you are, you crook!”
Example: Nicole identified that Hannah had committed a logical fallacy, but instead of addressing the substance of her claim, Hannah accused Nicole of committing a fallacy earlier on in the conversation.

STRAW MAN argumentum ad logicam (Informal Fallacy > Red Herring): You misrepresent someone’s argument to make it easier to attack, as a boxer might make a figure out of straw, punch it to pieces, then declare glorious victory over the actual human foe. By exaggerating, misrepresenting, or completely fabricating someone’s argument, it’s much easier to present your own position as being reasonable, but this kind of dishonesty undermines honest rational debate.
Example: After Bob says that we should put more money into health and education, Tom responds by saying that he is surprised that Bob hates our country so much that he wants to leave it defenseless by cutting military spending.

We have a page listing the Forty Most Common Logical Fallacies. We also have a PDF printout (8 pages) HERE. In addition to the fallacies cited above, it includes: Appeal to Force, Amphiboly, Anecdotal, Appeal to Authority, Appeal to Ignorance, Appeal to Nature, Bandwagon, Black-or-White, Burden of Proof, Coincidental Correlation, Common Sense, Does Not Follow, Fallacy Fallacy, False Analogy, False Equivalence, False Cause, Gambler’s Fallacy, Genetic, Hasty Generalization, Loaded Question, Middle Ground, Moving the Goalposts, Novelty, Personal Inconsistency, Personal Incredulity, Red Herring, Reification, Rhetorical Question, Sincerity, Slippery Slope, Special Pleading and Texas Sharpshooter.
Note: All entries include examples, some of which were considerately supplied by Ms. Clinton and Mr. Trump.

This is the eighth installment (part a) in our series on counterpropaganda.

Other reports and items of interest:
The Nine Principles of Propaganda begins HERE.
Trump – Our Psychopathic President begins HERE.
For a double-sided PDF copy of the principles of propaganda and counterpropaganda go HERE.
For a double-sided PDF copy of the twelve criteria of psychopathy go HERE.

THE NINE FUNDAMENTAL PRINCIPLES OF COUNTERPROPAGANDA
Propaganda is the backdoor hack into your mind

TRUTH – Honest opposition is practical, moral, and unbiased.
FOCUS – Address only one or at most two points.
CLARITY – Easily understood without further explanation.
RESONATE – Identify audience’s existing sentiments, opinions, and stereotypes that influence their perspectives, beliefs, and actions.
RESPOND – Lies not immediately refuted become the audience’s truth.
INVESTIGATE – Collect and analyze their propaganda to understand their message, target audience & objectives.
SOURCE – Expose covert sources of false propaganda.
REASON – Expose their logical fallacies. Human cognitive biases for rapid thought response make us vulnerable to faulty reasoning.
   REASON #8a – Logical Fallacies
   REASON #8b – Cognitive Biases
   REASON #8c – Continued Influence Effect of Misinformation
   REASON #8d – Debiasing Misinformation – Worldview and Backfire
DISSEMINATE – Share exposed propaganda with audiences not targeted; they can then recognize the lies and reciprocate.

Citations
1. Wikipedia – Counterpropaganda. Retrieved 1-26-19 from: https://en.wikipedia.org/wiki/Counterpropaganda#Expose_reasoning_errors

2. Lewandowsky, Stephan; Ecker, Ullrich K. H.; Seifert, Colleen M.; Schwarz, Norbert; Cook, John. (2012 Sep 17). “Misinformation and Its Correction: Continued Influence and Successful Debiasing.” Psychological Science in the Public Interest, Vol. 13, No. 3, 106–131. doi: 10.1177/1529100612451018. A metastudy.
Read paper on Academia.edu
SagePub.com abstract and paper
SagePub.com purchase portal
ResearchGate portal
Link to free copy of the entire paper and chart; retrieved 1-26-19 from: http://bocktherobber.com/wordpress/wp-content/uploads/2013/04/lewandowsky_et_al_misinformation_pspi_ip.pdf
Link to useful chart of “Concise Recommendation for the Practitioner”

3. Logical Fallacy Sources
The fallacies above are from our longer list of 40 Logical Fallacies (LINK) which were drawn from the following sources, reworded to varying degrees, reorganized and blended.

Your Logical Fallacy is – The School of Thought. One of two initial sources, and the source for much of the text. They have a great poster of their 24 most important logical fallacies, and their website is very user-friendly. They also have a page and poster on 24 cognitive biases at Your Cognitive Bias Is, which includes anchoring, confirmation, belief, self-serving and 20 other biases.
Fallacy Files – Fallacy Files.org, Gary N. Curtis. Source of the fallacy categorizations (e.g. Informal Fallacy > Red Herring > Emotional Appeal), also text, examples, Latin names.
Fallacies in Latin – TopWord.net. Latin names not found elsewhere.
List of Fallacies – Wikipedia. Some fallacies and text.
Logical Fallacies – Lexiconic.com .  Some fallacies and text.
Logical Fallacies Handlist – Kip Wheeler at Carson-Newman University.  Some fallacies, text, Latin names and inspiration for visual presentation appearance.
Spotting Logical Fallacies this Election Season (8-19-16). Hey Girl Communique.com. One of two initial sources.

Some Additional Sources on Fallacies
Fallacies – Internet Encyclopedia of Philosophy. Dowden, Bradley – CSU Sacramento. Alphabetical list of 224 fallacies includes duplicates under alternate names, plus a lengthy discussion.
Fallacies in Latin – Changing Minds.org. 26 fallacies.
Logical Fallacies and the Art of Debate – (Jan. 2001). California State University at Northridge. Short discussion and 21 fallacies.
Master List of Logical Fallacies – (Jan. 2018). 143 fallacies from University of Texas at El Paso

Additional Reading
Introduction to Logic: Venn Diagrams and Categorical Syllogisms:  https://philosophy.lander.edu/logic/syll_venn.html
Understanding Venn diagrams is a wonderful aid to understanding syllogisms.