Some bad ideas get past a mind’s defenses and then hijack the mind’s immune system. These bad ideas recruit the mind’s defenses to protect themselves, even if that recruitment ends up harming the mind that hosts them. This process is similar to what happens with metastatic cancer, which spreads from one location in the body to a distant one. Metastatic cancer “flips” elements of the body’s immune system, recruiting them to defend tumors and attack the body. When bad ideas hijack the mind’s immune system, they become resistant to correction and the mind becomes susceptible to more bad ideas. Bad ideas spread in the mind and can eventually take it over.

Mental immune systems can be healthy or unhealthy, functioning or nonfunctioning, uncompromised or compromised, resilient or fragile. The mind’s immune system develops over time and is shaped by nature and nurture. Physical impairments, such as damage to brain tissue, can affect the mind’s ability to ward off bad ideas or to self-correct, though many people with such damage remain able to do both. Furthermore, physical impairments do not explain all, or even most, cases in which people come to hold bad ideas. Environmental factors play a huge role in damaging mental immune systems and making people more vulnerable to bad ideas.

Once a bad idea has recruited the mind’s immune system to protect itself, we should consider it to be a pathological belief. A pathological belief is one that “is likely to be false, to produce unnecessary harm, and to be held with conviction and tenacity in the face of overwhelming evidence that it is likely false and will produce unnecessary harm” (Mauer 2022, 324).

Individual people, as well as groups and institutions, have mental immune systems. A person with a hijacked immune system can develop a delusional disorder, a condition in which the person holds a false belief in the face of overwhelming contrary evidence over a sustained period of time. A group with a hijacked mental immune system is a cult; in a cult, the group’s mental immune system is altered to protect the group’s bad ideas at the expense of the group’s wellbeing, the wellbeing of its members, and that of those around it.

Mental immune systems have two parts: internal (cognitive) and external (physical). We might think of external barriers as gatekeepers that filter information, keeping false and harmful information, such as disinformation, from getting through to pollute the information sphere (Bjola and Papadakis 2020). Gatekeepers include peer reviewers in scholarly work and fact-checkers in journalism. Internal barriers are people’s belief assessment tools. In a person with a healthy mental immune system, these assessment tools get deployed to challenge and reject bad ideas.

Inoculation theory explains the ways that our internal immune systems can be boosted. People can be taught to use belief assessment tools effectively, and they can practice assessing and rejecting bad beliefs. The inoculant is typically a weakened form of a bad idea. Once a person has practiced assessing and rejecting the weakened form of the bad idea, they are better prepared to reject a stronger (more cohesive, more compelling) presentation of the bad idea.

People with compromised mental immune systems do the opposite; they adopt bad ideas and reject good ones. Cult leaders use inoculation methods to train their followers to reject challenges, just as pathogens adapt to resist antibodies. In a healthy person, the mental immune system produces a swarm of reasons to support healthy beliefs and attack pathological ones.

Mental immunology as an idea has been around for a long time. Prudence, a classical virtue, is the ability to discern the true, the ethical, and the efficacious. A prudent person is one who will reject bad ideas. The eighteenth-century philosopher Immanuel Kant detailed the different kinds of judgment we use, which include differentiating true from false, right from wrong, and pleasure from pain.

Our beliefs relate to all three of the judgment dimensions Kant identifies. Take for example the QAnon belief that “Hillary Clinton runs a pedophile ring out of a pizza parlor basement and that the COVID-19 pandemic is caused by 5G cell phone towers as part of a plot to depopulate the earth” (Mauer 2022, 323). Within QAnon, these beliefs are held with conviction despite overwhelming evidence that they are false, despite the ethical wrong of believing such falsehoods, and despite the harm they cause. Yet QAnon flips judgment on its head, declaring the false to be true, wrong to be right, and the pain that results to be necessary (and even pleasurable for those inflicting it).

QAnon is an institution – with a leader, communication organs, rituals, emblems, etc. – that functions to subvert the mental immunity of its adherents. It induces mental illness by hijacking people’s reasoning ability into defending pathological beliefs and attacking healthy ones. Once someone such as a QAnon follower adheres to a pathological belief, they are more likely to adhere to other pathological beliefs. Ted Goertzel writes, "People who believed in one conspiracy were more likely to also believe in others" (1994, 731). Alan Bensley states, "It has been known for some time that people who tend to accept one false conspiracy theory, such as the claim that the 911 attack was an inside job, are also more likely to accept others, as well" (quoted in Dolan 2019). Bensley adds, "We found that measures of generic conspiracist ideation, specific fictitious conspiracy theory, and false conspiracy theory beliefs were all strongly and positively intercorrelated" (Bensley et al. 2020, 16).

What makes us hold to some beliefs and resist others? One explanation comes from Robert Cialdini in Influence: The Psychology of Persuasion. Cialdini states that beliefs have “legs” (88). He illustrates the model with a drawing of a table: the belief is the tabletop and the reasons supporting it are the legs. Remove enough legs and a small sideways force will cause the table to fall over (and the belief to collapse). A belief with multiple reasons supporting it is more likely to resist challenges than a belief with fewer reasons. A belief with one very sturdy reason – like a table with a pedestal – is also able to resist a lot of contrary pressure. The reasons we use to uphold our beliefs may be good or bad ones (we call the bad reasons rationalizations). Dan Kahan has pointed out how people protect beliefs that they think determine their self-worth. This identity-protective cognition can serve as a very sturdy support for beliefs, regardless of whether those beliefs are good or bad.

Cialdini’s table metaphor has some explanatory value, I believe, but we don’t usually see dinner tables grow new or sturdier legs; nor do we see them lose legs all that often. Our mental immune system, in other words, is more dynamic than the table metaphor suggests. Mental immunity more closely resembles the body’s immune system, in which cells and pathogens develop new offensive and defensive capabilities built from proteins and enzymes. In other words, a more accurate model for explaining the way our minds process belief is one that includes evolutionary adaptations.

Evolution doesn’t produce only adaptive changes; it also produces maladaptive ones. Changes are not inherently adaptive or maladaptive but are so only in relation to the environment. What’s adaptive in one environment is not adaptive in another. Similarly, in terms of human mental immunity, a belief can be adaptive in one social setting and not in another. A group may adopt beliefs that benefit itself but at the expense of other groups in the society. A belief in racial superiority might qualify as such a belief.

Human beings are, to a great extent, responsible for creating our environments, which then shape the beliefs, attitudes, and behaviors that are rewarded and punished. We establish environments through our institutions and through them we practice (or don’t practice) things like the search for the true, the good, and the beautiful. Therefore, rather than seeking to control each individual person, our efforts are better spent shaping institutions that establish the healthy environments we want, which will in turn encourage beliefs, behaviors, and attitudes that are adaptive for these environments. An ounce of prevention (creating healthy environments) is worth a pound of cure (dealing with the consequences of pathological beliefs, behaviors, and attitudes).

Institutions such as schools and journalism play outsized roles in enhancing or corrupting the mental immune systems of individual people and of groups. By focusing attention on some issues and not others, they create significance and relevance in the mind. For instance, journalism can focus more or less attention on the recent report that wildlife populations have crashed by 69% over the past 50 years (Davis 2022) or on Hunter Biden’s laptop. Which of these stories has more people talking? Which increases salience (emotional investment)? Which triggers people to protect their identities by accepting or rejecting various claims? Perceptions of relevance and significance (which relate to our attention) shape our beliefs and behaviors.

Healthy institutions are responsive to changes in the information environment. They have the tools to diagnose pathological beliefs and ignorance in the population and to treat such conditions by counteracting misinformation, disinformation, and dismediation (the discrediting of credible sources). They are able to serve as gatekeepers, crediting and promoting true and ethical beliefs and filtering pathological ones from further infecting a population. Most importantly, they understand how the mental immune system works and institute practices designed to make mental immune systems (in individual people and in groups) healthier, more resilient, and more robust. The war on bad ideas will never be over, but it doesn’t have to be doomed to failure.

References:

Bensley, D. A., Scott O. Lilienfeld, Krystal A. Rowan, Christopher M. Masciocchi, and Florent Grain. 2020. "The Generality of Belief in Unsubstantiated Claims." Applied Cognitive Psychology 34, no. 1: 16-28. doi:10.1002/acp.3581.

Bjola, Corneliu and Krysianna Papadakis. 2020. "Digital Propaganda, Counterpublics and the Disruption of the Public Sphere: The Finnish Approach to Building Digital Resilience." Cambridge Review of International Affairs 33, no. 5: 638-66. doi:10.1080/09557571.2019.1704221.

Cialdini, Robert B. 1984. Influence: The New Psychology of Modern Persuasion. New York: William Morrow.

Davis, Josh. 13 October 2022. “Wildlife populations have crashed by 69% within less than a lifetime.” Natural History Museum. Accessed October 16, 2022. https://www.nhm.ac.uk/discover/news/2022/october/wildlife-populations-crashed-by-69-within-less-than-a-lifetime.html.

Dolan, Eric W. 2019. "Study: Conspiracy Theory Believers Tend to Endorse Other Unsubstantiated Beliefs as Well." PsyPost, July. Accessed June 1, 2020. https://www.psypost.org/2019/07/study-conspiracy-theory-believers-tend-to-endorse-other-unsubstantiated-beliefs-as-well-54151.

Goertzel, Ted. 1994. "Belief in Conspiracy Theories." Political Psychology 15, no. 4 (December): 731-42. doi:10.2307/3791630.

Mauer, Barry. 2022. “The Cognitive Immune System: The Mind’s Ability to Dispel Pathological Beliefs.” In Global Modernity in the Shadow of Pandemic: A Cross-Disciplinary Update, edited by Hatem Akil and Simone Maddanu. Amsterdam: Amsterdam University Press.