The conflicting messages are everywhere – from the efficacy of vaccines, to the accuracy of election results, to the threat of climate change. Scrolling through your social media feed or listening to a family member express a belief you think is outrageous, you may have wondered how people can “fall for that.” Or maybe you’re just overwhelmed and confused.
The implications of disinformation for our planet, our democracy, and our collective health are huge. To help combat this problem, an anonymous donor has given a sizable gift to the College of Social and Behavioral Sciences. He hopes his donation inspires others to give to this important cause and advances the university's effort to teach people how to recognize and resist what he calls nefarious propaganda and psychological manipulation.
The gift will fund research and the creation of instructional innovations by Jonathon Reinhardt, professor in the Department of English, and Diana Daly, assistant professor in the School of Information, on topics ranging from the grammar underpinning disinformation to the social relationships that make combating disinformation so challenging.
Resisting Psychological Manipulation
A University of Arizona alumnus, the donor lost family members to the Holocaust and taught himself how to recognize and resist propaganda.
“I believe that as a country we have failed to teach our people how to recognize and resist propaganda and as a result, many of our citizens have fallen for an onslaught of very effective psychological manipulation,” the donor said. “In my mind, this has greatly contributed to our inability to effectively address many of our major societal challenges, including racism and racial injustice, climate disruption, access to affordable health care, wealth inequality, and efforts to destabilize our democracy.”
The donor believes that people need to learn to spot the rhetorical strategies of deception, including false choices, logical fallacies, false equivalence, cherry picking, manufactured outrage, fear mongering, scapegoating, hate speech, and character assassination.
“We all have inherent cognitive biases and tend to seek simple ways of understanding our world which hide or ignore complexity. Effective propaganda takes advantage of this,” he said. “It provokes us to react emotionally rather than think critically with empathy and understanding.”
“I believe that having familiarity with psychological manipulation (PM) techniques provides crucial insight into the trustworthiness of a speaker or author – it’s generally clear who is arguing in good faith and who is relying on PM to make their case,” he said. “If you understand the con, the con doesn’t work.”
The Clarify Initiative
Reinhardt's project, titled "The Clarify Initiative," includes the curation and creation of instructional materials, including digital resources that integrate critical language awareness into grammar and language instruction.
Reinhardt said that traditional grammar textbooks often do not explain how grammar is used as a tool for rhetoric – from using hyperbole, superlatives, and passive voice, to the strategic use of the pronoun “we” by politicians.
“Grammar isn’t some sort of neutral tool, and every message is used for something,” Reinhardt said. “We can use various linguistics tools to analyze discourse or texts – from political speeches to advertisements to news reports to TikTok videos – to understand the intention of the author.”
This semester, he’s revised a section of “English 406: Modern English Grammar” to include discussion of rhetoric, propaganda, and misinformation. Reinhardt also works with large databases of text to analyze the frequency of words and how they’re used with other words and in what particular contexts.
“We can use some of these findings from corpus analyses to empower students and help them develop critical awareness of how language is used,” Reinhardt said.
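The kind of corpus analysis Reinhardt describes — counting how often words occur and which words they tend to appear next to — can be sketched in a few lines of Python. This is a toy illustration under our own assumptions, not Reinhardt's actual toolchain, but it shows the basic idea of frequency and collocation counts (here surfacing the strategic pronoun "we" mentioned above):

```python
from collections import Counter
import re

def corpus_stats(text):
    """Toy corpus analysis: word frequencies and bigram collocations."""
    words = re.findall(r"[a-z']+", text.lower())
    freq = Counter(words)                      # how often each word occurs
    bigrams = Counter(zip(words, words[1:]))   # which words appear together
    return freq, bigrams

# Hypothetical sample in the style of a political speech
sample = ("We will fight for you. We will win. "
          "They want to take what we have built.")
freq, bigrams = corpus_stats(sample)

print(freq.most_common(3))     # "we" dominates the frequency list
print(bigrams.most_common(2))  # "we will" is the most frequent pairing
```

Real corpus tools work over millions of words and normalize for corpus size, but the underlying counts are the same: once you see that a speaker leans heavily on "we," you can start asking who that "we" includes and excludes.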
As part of the Clarify Initiative, Reinhardt will create and share a variety of educational materials and resources for use in high schools, community colleges, and universities. Products will include an open-source e-textbook and e-workbook, a free smartphone app, and a prototype for an educational simulation role-playing game.
“I’m extremely grateful for the opportunity and the gift of time to innovate and create curriculum that the world needs,” Reinhardt said. “The project can help people develop awareness towards how media is used to influence us and how we can be more active in understanding it.”
Daly’s project is titled “Immersive Truth: Using Groups and Stories to Construct and Deconstruct Propaganda” and will involve identification of narrative and social strategies behind the spread of disinformation via rich media. She will also pilot creative media solutions with students.
“For a long time, there’s been a focus on making sure people have the facts, but what we are encountering is that facts are not enough,” Daly said. “The narratives that people form are really essential for them to make sense of things that are happening. They are not easily subtracted from people’s minds. Facts don’t do it. The only way to subtract a narrative, I believe, is to replace it with another narrative.”
Daly said a desire for connection and a shared sense of purpose are driving forces behind the spread of both information and misinformation online.
“In an era when misinformation communities like QAnon mirror the behavior of religious groups, a crucial understanding we lack is of the purposes misinformation communities serve for followers who, by continuing to believe, continue to belong,” Daly said.
Daly’s project will include pilot research focused on wellness communities spreading anti-vaccine beliefs and behaviors. Her team will analyze viral misinformation memes, trace the source, and analyze the appeals. They will interview believers and nonbelievers. The team will also try to recruit concerned members of these communities to produce and spread memes and messaging that question, satirize, or debunk the misinformation.
Outcomes of the project will include online reports of the findings, along with various teaching tools, such as memes, digital stories, and apps that disentangle disinformation.
“I’m really grateful to have been matched through SBS with a donor who is thinking deeply about what I’m thinking deeply about, which is how to address this extraordinary and growing problem of bad information circulating in our social environments,” Daly said.
This story was included in the spring 2022 Developments newsletter.
Did you know?
Researchers use various terms for problematic information, including misinformation (false information) and disinformation (intentional false information).