Monday, November 20, 2023

Do clinical delusions have anything in common with a mythology mindset?

In my discussion of Steven Pinker’s book, Rationality, I referred to his observation that people tend to have a reality mindset in the world of immediate experience and a mythology mindset when discussing issues in the public sphere. Although that is an accurate observation about a general tendency, delusions are also fairly common in the world of immediate experience.

The delusions that most of us experience are fairly harmless. For example, it may not do you much harm to believe that you are happier than average, even if you aren’t. That common delusion may help to explain why so many people walk around with smiles on their faces.

For some unfortunate people, however, the world of immediate experience includes delusional beliefs that are symptomatic of mental ill-health. These are referred to as clinical delusions.


The question I ask above has been prompted by my reading of Lisa Bortolotti’s recent book, Why Delusions Matter. Lisa Bortolotti is a philosopher who specializes in the philosophy of the cognitive sciences, including issues relating to mental illness. She observes that there is a strong overlap between clinical and non-clinical delusional beliefs. The non-clinical delusional beliefs that she discusses include beliefs that Pinker would associate with a mythology mindset.

A conversation context

Bortolotti notes that in any discussion between two people, you have a speaker and an interpreter swapping roles as the conversation proceeds. The speaker says something and the interpreter listens, making inferences about the speaker’s beliefs, desires, feelings, hopes and intentions on the basis of the speaker’s words, facial expression, tone of voice, previous behaviour and so on.

Interpretation becomes challenging when the interpreter suspects that the speaker may be delusional. The interpreter rarely has the information needed to assess whether the speaker's beliefs are false, so falsity cannot be a necessary condition for the attribution of delusionality.

Three elements are often involved when the interpreter judges the speaker to be delusional:

  • Implausibility: The interpreter finds the speaker’s beliefs to be implausible.
  • Unshakeability: Speakers do not give up their beliefs in the face of counterarguments and counterevidence.
  • Identity: The beliefs seem important to the image that speakers have of themselves.

Clinical delusions

Bortolotti offers what she describes as an “agency-in-context” model to explain clinical delusions. She explains:

“The adoption and maintenance of delusional beliefs are due to many factors combining aspects of who you are and what your story is (your genes, reasoning biases, personality, lack of scientific literacy, etc.) and aspects of how epistemic practices operate in the society where you live.”

The epistemic practices she refers to include what we learn at school about knowledge acquisition, and the stigma that makes it difficult for people with delusional beliefs to participate fully in public life.

There is no doubt that persecutory delusions are harmful to the speaker and others. They undermine the ability of speakers to respond appropriately to events, and often erode their relationships with others.

However, Lisa Bortolotti suggests that it is important for interpreters to understand that most delusions offer some benefits for speakers. Delusions “let speakers see the world as they want the world to be; make speakers feel important and interesting; or give meaning to speakers’ lives, configuring exciting missions for them to accomplish”.

Interpreters also need to understand that the underlying problems of speakers don’t disappear when they obtain insight about their delusions. They may become depressed when they approach reality without the filter of their delusional beliefs.

There is not much to be gained by attempting to reason with people whose beliefs are unshakeable. Bortolotti suggests that it is probably more productive for the interpreter and speaker to share stories rather than exchanging reasons for beliefs. Exchanging stories can show how delusional beliefs emerged as reactions to situations that were difficult to manage. While sharing stories, interpreters have opportunities “to practice curiosity and empathy in finding out more” about underlying problems.

Conspiracy delusions

From an interpreter’s viewpoint, a speaker’s beliefs about the existence of conspiracies often have similar characteristics to clinical delusions. They are implausible, unshakeable, and closely tied to the speaker’s self-image.

Bortolotti emphasizes that those who hold conspiracy delusions often claim to have special knowledge of events – they claim to be experts, or to know who the real experts are. Identifying as a member of a group is often also important. Non-members often refer to members of such groups in a derogatory way, e.g. as QAnon supporters and anti-vaxxers. However, people are often attracted to conspiracy delusions promoted by like-minded people whom they trust. The act of sharing a delusional story can be a signal of commitment to a particular group.

Comments

Lisa Bortolotti’s book has improved my understanding of delusions in a couple of different ways. First, it has given me a better appreciation that delusions offer some benefits to the people who hold them, and those benefits help to explain the unshakeability of delusional beliefs.

Second, viewing delusions within the context of a conversation between a speaker and an interpreter is helpful in drawing attention to the value judgements involved in assessing whether the speaker’s beliefs are delusional.

My main criticism of the book is that the author seems to me to be biased in favour of "the official version" of events, even though she acknowledges that contrary beliefs are sometimes vindicated. The most obvious example of this bias is her apparent reluctance to give credence to the possibility that Covid-19 may have originated in a lab in Wuhan.

I am pleased that my reading of the book did not leave me with the impression that the author believes that it is delusional to have an unshakeable belief in the importance of the search for truth. In emphasizing that value judgements are involved in assessing whether beliefs are delusional, Lisa Bortolotti seems to me to be providing readers with a better understanding of the meaning attached to the concept of delusion in clinical and non-clinical settings, rather than casting doubt on the existence of reality.


Saturday, September 30, 2023

What's wrong with people?

This question is posed in the title of Chapter 10 of Steven Pinker’s book, Rationality: What it is, Why it Seems Scarce, Why it Matters.


I enjoyed reading the previous 9 chapters but didn’t learn much from them. Those chapters were a painless way to refresh my memory about definitions of rationality, rules of logic, probability, Bayesian reasoning, rational choice, statistical decision theory, game theory, correlation, and regression analysis.

I particularly liked the approach Pinker took in discussing the research of Daniel Kahneman and Amos Tversky which documents many ways in which people are prone to fall short of normative benchmarks of rationality. Pinker makes the point:

“When people’s judgments deviate from a normative model, as they so often do, we have a puzzle to solve. Sometimes the disparity reveals a genuine irrationality: the human brain cannot cope with the complexity of a problem, or it is saddled with a bug that cussedly drives it to the wrong answer time and again.

But in many cases there is a method to people’s madness.”

A prime example is loss aversion: “Our existence depends on a precarious bubble of improbabilities with pain and death just a misstep away”. In Freedom, Progress, and Human Flourishing, I argued similarly that loss aversion helped our ancestors to survive.

Pinker doesn’t blame the propensity of humans to commit logical and statistical fallacies for the prevalence of irrationality in the public sphere. He is not inclined to blame social media either, although he recognises its potential to accelerate the spread of florid fantasies.

The mythology mindset

Pinker argues that reasoning is largely tailored to winning arguments. People don’t like getting on to a train of reasoning if they don’t like where it takes them. That is less of a problem for small groups of people (families, research teams, businesses) who have a common interest in finding the truth than it is in the public sphere.

People tend to have a reality mindset when they are dealing with issues that affect their well-being directly – the world of their immediate experience – but are more inclined to adopt a mythology mindset when they are dealing with issues in the public sphere.

When economists discuss such matters, they may refer to the observation of Joseph Schumpeter that the typical citizen drops to a lower level of mental performance when discussion turns to politics. They reference the concept of rational ignorance attributed to Anthony Downs and Gordon Tullock. They may also refer to Bryan Caplan’s concept of rational irrationality. (For example, see Freedom, Progress, and Human Flourishing, pp 114-115.)

Pinker doesn’t refer to those economists’ perspectives but offers interesting insights about factors that might lead people to adopt mythology mindsets. In summary, as a consequence of myside bias, attitudes to the findings of scientific studies often have less to do with scientific literacy than with political affiliation. The opposing “sides” are sometimes akin to “religious sects, which are held together by faith in their moral superiority and contempt for opposing sects”. Within those sects the function of beliefs is to bind the group together and give it moral purpose.

What can we do?

Pinker’s suggestions for combatting irrationality in the public sphere are summed up by his subheading “Re-affirming Rationality”. He advocates openness to evidence, noting the findings of a survey suggesting that most internet users claim to be open to evidence. He suggests that we valorize the norm of rationality by “smiling or frowning on rational and irrational habits”.

Pinker identifies institutions that specialize in creating and sharing knowledge as playing a major role in influencing the beliefs that people hold. Since “no-one can know everything”, we all rely on academia, public and private research units, and the news media for a great deal of the knowledge which forms the basis of our beliefs. Unfortunately, these institutions are often not trustworthy.

In the case of the universities, Pinker suggests that the problem stems from “a suffocating left-wing monoculture, with its punishment of students and professors who question dogmas on gender, race, culture, genetics, colonialism, and sexual identity and orientation”. News and opinion sites have been “played by disingenuous politicians and contribute to post-truth miasmas”.

It is easy to agree with Pinker that it would be wonderful if universities and the news media could become paragons of viewpoint diversity and critical thinking. However, movement toward that goal will require large numbers of individuals to enlist for a ‘long march’ to re-establish norms of rationality in institutions that specialize in creating and sharing knowledge.