From the inflexible will of Nazi SS or ISIS combatants to fight for their empire, to devoted Muslims, Jews, and Christians vis-à-vis their respective holy books and Jerusalem, to Western leftists committed to political equality or “White supremacists” to racial purity, accumulating evidence since Durkheim1 suggests that humans evolved propensities to rigidly relate to, or absolutize, core moral imperatives, especially when group memberships and identities are made salient by intergroup conflict. Whether or not they lead to political violence, these phenomena are ubiquitous and raise many questions, to which the burgeoning sciences of morality and extremism have begun to give preliminary answers.

The mind is a collection of specialized, “subpersonal”2 information-processing systems, calibrated by natural selection to trigger the cognitions and behaviors that, over evolutionary time, increased reproductive success in the social and natural environments our forager ancestors navigated.3 What I call moral rigidity, or absolutization, seems to combine and rely upon at least two such evolved, and perhaps complementary, systems. The first is sacralization, the motivational propensity to imbue norms and objects with significance out of all proportion to their practical utility. The second is moral objectivism, the predisposition to conceive of sacralized prescriptions as “facts” of nature, externally imposed on all human minds and grounded in a transcendent reality or entity (e.g., God).4, 5

What is puzzling about moral rigidity is that its objects seem to demand being honored intrinsically, with little regard for consequences and opportunity costs. In blatant contrast with the economic wisdom according to which all things must ultimately take on a finite value and be mutually fungible, tradeoffs between sacred and mundane things, whether real or imagined, are treated as taboo, morally contaminating, or outrageous.6-9

Moral rigidity is a mental function capable of operating on a wide range of commitments. To the extent that those commitments remain compatible with an encompassing conception of humanity and human flourishing, absolutizing certain moral values or ideals may be a good criterion of moral trustworthiness. You would probably never agree to shoot an innocent fellow citizen or child, renounce the freedom of the press, or sell the Brazilian rainforest to cynical lumberjacks, even for all the tea in China.

Moral rigidity, which may be part of our innate group or coalitional psychology,10 can become dangerous when it fosters or gets recruited by (the causality may go both ways) parochial tendencies that clearly undermine the common good. It naturally tends to fixate on the norms, resources, and shared narratives that are collectively treated — sometimes rather arbitrarily — as key to group cooperation and survival and as definitional to the group's identity and integrity in intergroup rivalry. Moral rigidity typically increases the salience of in-group/out-group boundaries and triggers overconfidence in one's group and closed-mindedness and spite towards rival groups. Disagreements about which values and goals are treated as sacred and objectified play a starring role in sustaining many contemporary conflicts, from political polarization among Democratic and Republican supporters and politicians11, 12 to intractable wars in the Middle East.13 For example, when systems of sacralization and moral objectification in Palestinian and Israeli minds become culturally incentivized to compute their respective groups' aspirations to settle in Palestine as non-negotiable — “By will of God, this strip of land is just ours, and it is priceless” — no material compensation, whether offered by the other side or a third party, seems able to assuage them. Sometimes, proposals to compromise may even make things worse.13 What utilitarians would regard as seeking a middle ground between equally valid claims will be interpreted by sacralizers as an attempt at ‘corrupting’ them and their supporters, and be met with heated outrage. Brain imaging studies even suggest that the representation of sacred values and that of inflexible grammar rules have common neural bases.14

To date, no encompassing theory has proposed to connect sacralization and moral objectivism by viewing them as complementary outcomes of a common evolutionary dynamic. Drawing on accumulating, though relatively scattered, evidence, I argue that moral rigidity and its intimate link to in-group boundaries may have evolved to make us behave, and be seen, as trustworthy yet cautious team members in social environments mired in intergroup competition.15 In short, the story may be the following. Regardless of the specific social organizations and ecologies our ancestors navigated, the basic challenges of social life remained the same across millennia. Individuals had to (a) secure long-term support from the most cooperative allies by demonstrating commitment to them (which includes meting out “third-party punishment”), while at the same time (b) avoid being personally exploited by hostile in-group and out-group members.16 From a decision-making point of view, each challenge would have entailed a specific type of error that absolutely had to be avoided.17 With respect to (a), it would have been much better for one's reputation to forgo occasional opportunities to rest or keep a few resources for oneself by being ‘over-devoted’ to the community's activities than to be kicked out of it for having been insufficiently involved. And as regards (b), erring on the side of excessive touchiness by ‘over-detecting’ cues of potential exploitation, even when none is present, would have been much wiser than paying the ultimate price of ending up with betrayers. Furthermore, the imperative to avoid both errors would have been all the more pressing the higher the interdependence within groups, as in tribal warfare or harsh natural environments.
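To make this error-management logic concrete, here is a minimal decision-theoretic sketch. It is only an illustration of the asymmetric-cost argument, not a model taken from ref. 17, and the cost values are arbitrary assumptions:

```python
# Minimal error-management sketch (all numbers are illustrative
# assumptions): compare the expected cost of trusting vs. distrusting
# a partner when the two possible errors differ sharply in cost.

def expected_costs(p_betrayal, cost_fn, cost_fp):
    """p_betrayal: probability the partner is a betrayer.
    cost_fn: cost of a false negative (trusting a betrayer).
    cost_fp: cost of a false positive (snubbing an honest partner)."""
    cost_if_trust = p_betrayal * cost_fn            # pay only if betrayed
    cost_if_distrust = (1 - p_betrayal) * cost_fp   # pay only if partner was honest
    return cost_if_trust, cost_if_distrust

# Hypothetical costs: being betrayed is 20x worse than needlessly
# distrusting an honest partner.
COST_FN, COST_FP = 100.0, 5.0

# Distrust minimizes expected cost as soon as
# p_betrayal > cost_fp / (cost_fp + cost_fn).
threshold = COST_FP / (COST_FP + COST_FN)
print(f"rational suspicion threshold: {threshold:.3f}")  # ~0.048

for p in (0.02, 0.05, 0.20):
    trust, distrust = expected_costs(p, COST_FN, COST_FP)
    best = "trust" if trust < distrust else "distrust"
    print(f"p(betrayal)={p:.2f}: E[cost|trust]={trust:.1f}, "
          f"E[cost|distrust]={distrust:.1f} -> {best}")
```

Under these assumed costs, distrust already minimizes expected cost when the odds of betrayal are barely 5 percent: ‘over-detecting’ exploitation cues is not paranoia but the rational default.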

There are many reasons to believe that the task of figuring out the right move in games of cooperation — collaborate? distrust? — cannot be left entirely in the hands of conscious calculation. Cognitive mechanisms must be explained from the long-term perspective of their cumulative effects over an organism's lifetime, inherent limitations included. Opportunistic decisions to commit that rely on effortful reflection would sometimes have been at a disadvantage, compared with cooperative instincts, when it came to dispelling vigilant partners' doubts that they might one day get screwed. Besides, brain computations aren't magical: they are slow and could not possibly anticipate the consequences of a selfish act with perfect accuracy. Hence, there is growing consensus that selection should also have favored automatic intuitions that compel us to do the right thing without thinking twice about it.18-21
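A toy payoff comparison, in the spirit of these models but with my own assumed parameters (c, V, q, and d are hypothetical, not taken from refs 18-21), shows why uncalculating commitment can outcompete case-by-case reflection:

```python
# Toy comparison of an 'intuitive' type (always cooperates) with a
# 'deliberating' type (cooperates only when it immediately pays).
# All parameter values are illustrative assumptions.

def intuitive_payoff(c, V):
    # Always pays the cooperation cost c, but always keeps the
    # long-term benefit V of being retained as a trusted partner.
    return V - c

def deliberator_payoff(c, V, q, d):
    # Saves the cost c in the fraction q of interactions where
    # defection would go unpunished, but is spotted as a calculating
    # type with probability d, and then loses the benefit V.
    return (1 - d) * V - (1 - q) * c

c, V = 2.0, 10.0   # cost of cooperating; value of staying a trusted partner
q, d = 0.2, 0.3    # share of 'safe' defection chances; detection probability

print(intuitive_payoff(c, V))           # 8.0
print(deliberator_payoff(c, V, q, d))   # 5.4
# The intuitive type wins whenever d * V > q * c: the trust premium
# of never being seen 'calculating' outweighs the gains from
# occasional opportunistic defection.
```

On these assumptions, reflexive cooperation outperforms opportunistic calculation precisely because vigilant partners can sometimes tell the difference.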

Processes of sacralization and objectification of moral prescriptions are of this type. More specifically, they make us perceive what are ultimately mutual, interpersonal obligations between partners within a group as absolute coordinates of reality. By making it very hard even to conceive of the possibility that key social obligations might not be honored, blinding minds to opportunity costs, and triggering instinctive outrage22 and willingness to punish encroachments on their objects,23 those intuitions may simultaneously constitute (a) honest signals of trustworthiness and (b) automatic protection barriers against potential mistreatment. In other words, their adaptive magic may be to kill two birds with one stone: motivating costly commitment and self-defense at the same time, with parsimony, simply by absolutizing one impartial rule of conduct per domain (e.g., “Thou shalt be loyal to thy comrades”), and without any conscious reason for doing so.

References:

  1. Durkheim, E. (1915/1965). The Elementary Forms of the Religious Life. New York: The Free Press.
  2. McKay, R., & Dennett, D. (2009). The evolution of misbelief. Behavioral and Brain Sciences, 32(6), 493–561.
  3. Tooby, J. & Cosmides, L. (1992) The psychological foundations of culture. In: The Adapted Mind: Evolutionary Psychology and the Generation of Culture, ed. Barkow, J., Cosmides, L. & Tooby, J., pp. 19–136. Oxford University Press.
  4. Goodwin, G. P., & Darley, J. M. (2012). Why are some moral beliefs perceived to be more objective than others? Journal of Experimental Social Psychology, 48(1), 250–256. https://doi.org/10.1016/j.jesp.2011.08.006
  5. Stanford, K. (2018). The difference between ice cream and Nazis: Moral externalization and the evolution of human cooperation. Behavioral and Brain Sciences, 41, e95. https://doi.org/10.1017/S0140525X17001911
  6. Baron, J., & Spranca, M. (1997). Protected values. Organizational Behavior and Human Decision Processes, 70, 1-16.
  7. Gómez, Á., López-Rodríguez, L., Sheikh, H., Ginges, J., Wilson, L., Waziri, H., … Atran, S. (2017). The devoted actor’s will to fight and the spiritual dimension of human conflict. Nature Human Behaviour, 1(9), 673–679. https://doi.org/10.1038/s41562-017-0193-3
  8. Graham, J., & Haidt, J. (2012). Sacred values and evil adversaries: A moral foundations approach. In M. Mikulincer & P. R. Shaver (Eds.), Herzliya series on personality and social psychology. The social psychology of morality: Exploring the causes of good and evil (pp. 11-31). Washington, DC, US: American Psychological Association.
  9. Tetlock, P. E. (2003). Thinking the unthinkable: Sacred values and taboo cognitions. Trends in Cognitive Sciences, 7(7), 320–324. https://doi.org/10.1016/S1364-6613(03)00135-9
  10. Tooby, J. & Cosmides, L. (2010) Groups in mind: The coalitional roots of war and morality. In: Human morality and sociality: Evolutionary and comparative perspectives, ed. H. Høgh-Olesen, pp. 191–234. Palgrave Macmillan.
  11. Frimer, J., Motyl, M., & Tell, C. (2016). Sacralizing liberals and fair-minded conservatives: Ideological symmetry in the moral motives in the culture war. Analyses of Social Issues and Public Policy, 17, 33–59.
  12. Van Bavel, J. J., & Pereira, A. (2018). The Partisan Brain: An Identity-Based Model of Political Belief. Trends in Cognitive Sciences, 22(3), 213–224. https://doi.org/10.1016/j.tics.2018.01.004
  13. Atran, S., Axelrod, R., & Davis, R. (2007). Sacred barriers to conflict resolution. Science, 317, 1039–1040. https://doi.org/10.1126/science.1144241
  14. Berns, G. S., Bell, E., Capra, C. M., Prietula, M. J., Moore, S., Anderson, B., Ginges, J., & Atran, S. (2012). The price of your soul: Neural evidence for the non-utilitarian representation of sacred values. Philosophical Transactions of the Royal Society B, 367, 754–762. https://doi.org/10.1098/rstb.2011.0262
  15. Marie, A., & Fitouchi, L. (in preparation). The evolution of moral rigidity.
  16. Baumard, N., André, J.-B., & Sperber, D. (2013). A mutualistic approach to morality: The evolution of fairness by partner choice. Behavioral and Brain Sciences, 36(1), 59–122. https://doi.org/10.1017/S0140525X11002202
  17. Haselton, M. G., Nettle, D., & Andrews, P. W. (2015). The Evolution of Cognitive Bias. In D. M. Buss (Ed.), The Handbook of Evolutionary Psychology (pp. 724–746). Hoboken, NJ, USA: John Wiley & Sons, Inc. https://doi.org/10.1002/9780470939376.ch25
  18. Frank, R. (1988). Passions within reason: The strategic role of the emotions. Norton.
  19. Delton, A. W., Krasnow, M. M., Cosmides, L., & Tooby, J. (2011). Evolution of direct reciprocity under uncertainty can explain human generosity in one-shot encounters. Proceedings of the National Academy of Sciences USA, 108, 13335–13340.
  20. Jordan, J. J., Hoffman, M., Nowak, M. A., & Rand, D. G. (2016). Uncalculating cooperation is used to signal trustworthiness. Proceedings of the National Academy of Sciences, 113(31), 8658–8663. https://doi.org/10.1073/pnas.1601280113
  21. Bear, A., & Rand, D. G. (2016). Intuition, deliberation, and the evolution of cooperation. Proceedings of the National Academy of Sciences USA, 113(4), 936–941.
  22. Delton, A. W., Nemirow, J., Robertson, T. E., Cimino, A., & Cosmides, L. (2013). Merely opting out of a public good is moralized: An error management approach to cooperation. Journal of Personality and Social Psychology, 105(4), 621–638. https://doi.org/10.1037/a0033495
  23. Jordan, J. J., Hoffman, M., Bloom, P., & Rand, D. G. (2016). Third-party punishment as a costly signal of trustworthiness. Nature, 530(7591), 473–476. https://doi.org/10.1038/nature16981