Showing posts with label Leadership. Show all posts

Friday, July 20, 2018

How can we overcome confirmation bias?


This guest post by Leah Goldrick was originally published on her excellent blog, Common Sense Ethics. Leah acknowledges that confirmation bias is linked to pattern recognition, which serves a useful purpose. The confirmation bias problem arises when we seek out information to confirm what we believe and ignore everything else.

The documentary that Leah refers to in her first paragraph is worth watching. It illustrates how easily a group of people who did not seem particularly gullible came to hold an unshakeable belief that the end of the world would occur on 21 May 2011.



Why is it so hard for us to change our beliefs, or to get other people to change their minds? A new documentary film, Right Between Your Ears, examines the science and psychology of how people form convictions. According to producer Kris De Meyer, a neuroscientist, certain aspects of human psychology make it very hard for us to be objective or unbiased.

People usually form beliefs by accepting what they've been told by someone they trust: parents, teachers, the media and so on. Our beliefs can change when we learn new things. But once we become convinced of something, our brain treats it much like a religious belief, and we may react with anger when challenged. This human tendency often leads us to seek out information which confirms what we already believe and to ignore everything else - a cognitive bias called confirmation bias.

It seems obvious why confirmation bias can be a problem - it can prevent us from making good decisions. It makes us rigid thinkers. Someone can easily fool us simply by appealing to our established belief systems. The good news is that there are some practical strategies for overcoming this natural human shortsightedness, which I'll let you in on at the end of the post.

How We Form Beliefs

Let me back up for just a second. What led me to write this post (besides my abiding interest in critical thinking) was the Shakespeare authorship course I recently took online via the University of London. The topic is just about the most interesting one ever, and the instructor, Dr. Ros Barber, focused the first lesson on the science of how beliefs are formed, cognitive bias, and how belief systems can crystallize into orthodoxies which may not be questioned without ridicule.

Dr. Barber interviews Kris De Meyer, a neuroscientist and documentary film maker currently working in the Department of Neuroimaging at King's College London's Institute of Psychiatry, Psychology and Neuroscience, about how we form our beliefs in the first place.

According to De Meyer, we form fairly rigid belief systems or perceptual frameworks out of necessity as we go through life in order to handle the information continually coming at us. Usually, our perceptual framework serves us quite well. But it can also be a major intellectual handicap when we are confronted with information which undercuts our established belief systems. De Meyer states:

"But beliefs become strongly held and particularly if we build our identity around them, they begin to act as perception filters. Indeed, it might be useful to think of a belief as a perceptual framework, something that helps us make sense of the world around us." 

Confirmation Bias

The problem with our perceptions being filtered through our belief structures is that it can create something called confirmation bias. We tend to interpret new information in a way that strengthens our preexisting beliefs. When we are confronted with information which conflicts with our beliefs, we will often find ways to discard it. We also tend to search out information which confirms our beliefs rather than looking for more neutral or contradictory information.

For our general functioning in the world, we must keep our perceptual frameworks fairly rigid. So even when our brain encounters anomalous data, confirmation bias can lead us to explain it away as an error. Experiments in the 1960s hinted that people are biased towards confirming their existing beliefs. Later experiments focused on our natural tendency to test ideas in a one-sided way, focusing on one possibility and ignoring alternatives.

Anyone can suffer from confirmation bias: teachers, Shakespeare scholars, even scientists. In one study on confirmation bias involving scientists, over half of laboratory experimental results were inconsistent with the scientists' original hypotheses. In these cases, the scientists were reluctant to consider that data as valid. The anomalous finding was usually classified as a mistake. Even after scientists had produced an anomaly more than once, they would often choose not to follow up.

When we perceive, we construct systems of beliefs inside our heads like a lawyer trying to prove a case. The more strongly we are engaged in a topic, the more likely we are to dismiss contradictory evidence. On both sides of any debate, we each have a system of beliefs that tells us that we are right and the other side is wrong.

According to Ros Barber, "[When any conflict happens] it's been described as "a dialog of the deaf" because people can't hear the other point of view. They just think it's totally invalid." 

Cognitive Dissonance

So why does confirmation bias happen? It might be because of wishful thinking, or because of our limited mental capacity to process information. It could also have to do with a failure to imagine alternate possibilities (more on this later). Another explanation for confirmation bias is that people are afraid of being wrong, and fail to ask the right probing questions about their beliefs, instead reasoning from their already held conclusions.

When we are confronted with contradictory evidence, it causes something called cognitive dissonance - mental distress caused by information that doesn't fit in with our current understanding of the world. Cognitive dissonance is uncomfortable and people will sometimes go to great lengths to avoid it.

Cognitive dissonance was first theorized by psychologist Leon Festinger, who argued that we have an inner drive to hold consistent beliefs. Holding inconsistent beliefs causes us to feel disharmonious. Festinger studied a cult whose members believed that the earth was going to be destroyed by a flood. He investigated what happened to the cult members, especially the committed ones who had given up their homes and jobs, after the flood did not happen on the proposed date.

The most committed cult members were more likely to rationalize their original beliefs (confirmation bias) even after experiencing cognitive dissonance in the face of the flood not happening. Loosely affiliated members were much more likely to admit that they had simply made a mistake and to move on. The more attached we are to a belief, the harder it is to change it. 

How To Think (and Debate) With Less Bias

So what are the best strategies for overcoming our natural human shortsightedness and bias? The first is to keep emotional distance in reasoning, and the second is to consider the other side (or sides) of any debate, a technique called the "consider the opposite" strategy.

1. Keep Emotional Distance When Reasoning

Given the natural human tendency towards confirmation bias, it is important to be at least somewhat dispassionate when reasoning and debating. I like to call this emotional distance. Emotional distance is just as much a character trait of a reasonable person as it is a strategy for handling cognitive biases.

Confirmation bias may in part stem from our desire not to be wrong, so by keeping emotional distance you are essentially willing to admit to yourself that you could have some things wrong. Don't be too attached to any particular piece of evidence. In any difficult debate, we may all get certain parts of the puzzle wrong.

Look out for signs of confirmation bias in yourself. Remember that the more strongly held your beliefs are, the more likely you are to refuse to consider alternative evidence - like the cult members who invested everything in their belief in an impending flood.

Emotional distance also involves viewing debate as dialog rather than an angry fight. If your ego gets too caught up in defending a certain belief, you are more likely to get angry when contradicted. Angry people usually double down and become more extreme in their point of view rather than considering someone else's. Keep in mind that politeness might actually be the secret weapon for getting someone to overcome their bias. Kris De Meyer suggests:

"When we do feel a pang of anger at being challenged, rather than responding very quickly we can step away from it for maybe a few hours or a day, think carefully about where that person is coming from, and then see if we can give them more constructive comments that then doesn't spark his or her angry response. Because it's those feelings of anger at being misunderstood and of being misrepresented that really are the ones that drive us towards more certainty. And if the conversation becomes amicable, it can be heated and passionate without being acrimonious and negative. The way to use [your knowledge of confirmation bias] is to question yourself and to reflect on your own assumptions and your own interactions with other people."

Maintaining emotional distance is powerful, but it may not be enough to overcome biases, which is why we should also use this second strategy:

2. Consider the Opposite

Confirmation bias may in part be the result of our limited mental capacity to imagine alternative scenarios. The "consider the opposite" strategy helps us to envision how else things might be. In a recent study, this technique was shown to work better than simply attempting to remain objective.

Considering the opposite in everyday practice works like this: you take a look at a set of facts about something. Generally, you would try to discern whether the facts support your belief or not. If you are experiencing confirmation bias, you would probably imagine that the facts do actually support your belief. But when you force yourself to consider the opposite, you instead imagine that the facts point the opposite way, disproving your belief. This helps you to imagine alternatives to what you already believe.

The consider the opposite strategy works particularly well with diametrically opposed beliefs, but always bear in mind that there may be more than one alternate possibility. Be willing to entertain various possibilities rather than falling victim to false dichotomies. 

Sunday, June 24, 2018

How can people become more open to critical evaluation of their own views?


It might not be obvious to everyone that it is desirable for people to be open to critical examination of their own views. The process of critical examination takes time and energy and can be unsettling. If it leads a person to change his or her view, relatives and friends might disapprove.


What is the problem with immunity to change? One problem is failure to actualize potential. In the first chapter of their book, Immunity to Change: How to overcome it and unlock the potential in yourself and your organisation, Robert Kegan and Lisa Laskow Lahey provide evidence suggesting that immunity to change of attitude tends to hinder the mental development of adults. Survey data indicates that there is potential for mental development to continue throughout adulthood, at least until old age. Development tends to occur unevenly, with periods of change followed by periods of stability.

Researchers have identified three adult plateaus of development corresponding to different meaning systems that people use to make sense of the world and operate within it:

·         A socialized mind enables an individual to be a faithful follower and team player.

·         A self-authoring mind can generate an internal belief system/ ideology/ personal code and is self-directed. It places priority on receiving the information it has sought and creates a filter which determines what information it allows to come through.

·         A self-transforming mind can step back from and reflect on the limits of personal ideology and systems of self-organisation. Individuals at this level of mental development value the filter they have created to separate the wheat from the chaff, but they also value information that may alert them to limits of their filter.

Individuals at each successive level of mental development can perform the mental functions of the prior level as well as additional functions. A person who has attained the self-transforming stage of development can be self-authoring when required to develop and execute a plan of action, and can also be a team player when that is appropriate.

Studies involving several hundred participants suggest that most people (58% of respondents) have not attained a self-authoring level of development. Of the remainder, only a tiny percentage have self-transforming minds. The studies probably exaggerate the level of personal development of the population at large because the samples were skewed towards middle-class professionals.

This research seems highly relevant to questions considered recently on this blog about echo chambers in the social media and the reluctance of many people to listen to opposing viewpoints, as well as to consideration of the ingredients of good leadership. The vast majority of those who aspire to be able to reflect objectively on the limitations of their views of how the world works are likely to be biased against seeking information that might challenge those views.

As the title of the book suggests, Immunity to Change is about overcoming the psychological resistance that prevents us from making the changes we want to make in our own lives and within organisations. The book is replete with examples, drawn from the extensive consulting experience of the authors, to illustrate how people can identify and deal with hidden fears that prevent them from making the changes they want to make. Most readers of this book are probably aspiring to leadership positions or attempting to change organisations, but much of the material in it is relevant to anyone who is attempting to make changes in their lives.

I will focus here on the approach to overcoming internal resistance that the book might suggest for a person who wants to become more open to critical evaluation of his or her own views on issues that have become highly politicised. I will provide my own responses to the series of questions suggested by the authors, rather than speculate about how others might respond. Hopefully my introspection will have some relevance to others.

1.       What is your improvement goal?

As already noted, I want to be more open to critical evaluation of my views on issues that have become politicised. My reason for doing this is that I suspect the opposing side on such issues might sometimes have genuine concerns that are worth considering.

2.       What are you doing/ not doing instead?

I rarely read opinion pieces by commentators whom I consider likely to be opposed to my views on controversial issues. I have sometimes expressed the view that I need to be paid to read such commentary.

When friends and relatives challenge my views on controversial issues, my response is often overly defensive. I begin such conversations with the intention of ensuring I understand the opposing point of view, but I am easily diverted to point scoring.

3.       What hidden competing commitments prevent achievement of your improvement goal?

When I imagine myself reading commentary that is opposed to my views, I feel that I am likely to be bored by a recitation of views that I have previously rejected. It seems like a waste of time. However, I must also acknowledge fear that reading such commentary could be unsettling. The authors of these pieces often do their best to appeal to the emotions of their readers. I acknowledge some concern that I might need to modify my views if I start feeling sympathy for the plight of victims of policies that I support. The hidden commitments underlying those concerns are to avoid feeling unsettled and to avoid being swayed by appeals to emotion.

My defensiveness in conversations on controversial topics with people with opposing viewpoints seems to be related to the tendency for such conversations to degenerate into point-scoring exercises in which participants attempt to attach labels to each other. I am concerned that I might respond in kind if conversation partners disrespect me. The hidden commitments are to avoid being labelled and to avoid losing self-control.

4.       What are the big assumptions that underlie this immune system?

I accept that the hidden commitments identified above act as an immune system to prevent progress toward my improvement goal. I can see why I am unlikely to be able to make much progress merely by forcing myself to read commentary that is opposed to my views, or by telling myself not to become defensive when discussing controversial issues. The hidden commitments identified above have been acting as an anxiety reduction system.

 The authors of Immunity to Change explain the concept of “big assumption” as follows:

"We use the concept of big assumptions to signal that there are some ways we understand ourselves and the world (and the relationship between the world and ourselves) that we do not see as mental constructions. Rather, we see them as truths, incontrovertible facts, accurate representations of how we and the world are.
These constructions of reality are actually assumptions; they may well be true, but they also may not be. When we treat an assumption as if it is a truth, we have made it what we call a big assumption."

The big assumptions underlying the hidden commitments I have identified seem to be related to self-trust. There is an assumption that I can’t trust myself to feel sympathy for the plight of some unfortunate people without losing my mental faculties. There is also an assumption that I can’t trust myself not to lose control if I am disrespected.

Identifying those big assumptions was an “aha” moment for me. The absurdity of the assumptions seemed obvious as soon as they were identified.

However, Kegan and Lahey emphasize that the process of overcoming immunity to change does not end with identifying big assumptions. The next step is to design tests capable of disconfirming the big assumptions. The tests involve changes in usual conduct that generate information that we can reflect upon to challenge the big assumptions. The authors emphasize that the purpose of running the tests is not to see whether performance has improved, but to generate information to provide a learning experience.

This is where my story ends. In writing this article I have ‘tied myself to the mast’ with a public commitment to test my big assumptions. However, it could be counterproductive to disclose what tests I have in mind, and I’m certainly not going to promise to write a sequel to tell you what happens.

Even if it achieves nothing more, this exercise of identifying big assumptions has made me more appreciative that the difficulty other people have in being open to critical evaluation of their own views could well be attributable to deep-seated fears.

I recommend Immunity to Change to anyone struggling to understand why they are having difficulty making the changes they want to make in their own behaviours.

Saturday, June 9, 2018

What should be done about echo chambers in the social media?



Why bother reading a book by Cass Sunstein which suggests that echo chambers in the social media are becoming a problem for democracy and that something should be done about them? That was a question I had to ask myself before deciding to read Sunstein’s recently published book, #Republic: Divided Democracy in the Age of Social Media.

The people who are likely to be most enthusiastic about reading this book will already be concerned about echo chambers, and be fans of Sunstein. I was already concerned about echo chambers before reading the book, but reading other books by Sunstein did not induce me to join his fan club. From his interview about this book with Russ Roberts on EconTalk, I thought some of the views presented would be challenging. I was in no hurry to read the book.

That illustrates a problem with echo chambers. Many of us have a tendency to avoid being challenged even when there is potential to learn something useful from people who have opposing viewpoints. I only read the book because I have recently been thinking and writing about the potential benefits of listening to opposing viewpoints.

The book was worth reading to help me clarify my own views. In summary, Sunstein suggests: 
“to the extent that people are using social media to create echo chambers, and wall themselves off from topics and opinions that they would prefer to avoid, they are creating serious dangers. And if we believe that a system of free expression calls for unrestricted choices by individual consumers, we will not even understand the dangers as such”.

The serious dangers that Sunstein is referring to include group polarisation, the spreading of falsehoods within echo chambers, a high degree of social fragmentation and greater difficulty of mutual understanding.

The author doesn’t claim that this is currently the general pattern, or that group polarisation and cybercascades are always bad. He recognizes that it is sometimes good for a perception or point of view to spread rapidly among a group of like-minded people. His claim is that group polarisation can, nevertheless, be a significant risk even if only a small number of people choose to listen and speak solely with those who are like-minded. Enclave deliberation can cause members of groups to move to positions that lack merit e.g. terrorist agendas. “In the extreme case, enclave deliberation may even put social stability at risk”.

Turning to the second part of the quoted passage, readers may wonder how Sunstein can argue that a system of free expression can be consistent with regulation of consumer choices.  His argument seems to rest on two propositions:

·         First, free speech is not an absolute – despite the free speech guarantee in the U.S. constitution, government is permitted to restrict speech in various ways e.g. attempted bribery, criminal conspiracy, child pornography.

·         Second, the free speech principle should be read in light of the commitment to democratic deliberation rather than consumer sovereignty. From the perspective of supporting democratic deliberation, regulation of television, radio and the Internet may be permissible to promote democratic goals.

I’m uneasy about the second proposition. The U.S. Supreme Court would presumably disallow legislation which purported to support democratic deliberation in a manner that conflicted seriously with fundamental freedoms. In parliamentary systems that have no constitutional guarantees of liberty, however, legislative action to support democratic deliberation could be far-reaching and ideological. For example, it could mandate coverage in school curriculums of the foundations of democracy in the history of western civilization, or alternatively, of its foundation in the history of protest movements and revolutions.

The purpose for which Sunstein seeks government action to support democratic deliberation is to ensure a measure of social integration by promoting exposure of people to issues and views that might otherwise escape their attention. He writes:

“A society with general-interest intermediaries, like a society with a robust set of public forums, promotes a shared set of experiences at the same time that it exposes countless people to information and opinions that they would not have sought out in advance. These features of a well-functioning system of free expression might well be compromised when individuals personalize their own communications packages—and certainly if they personalize in a way that narrows their horizons”.

I support those sentiments, but I am wary of government intervention in support of them. Seemingly benign government action in support of public forums can be counterproductive. I have in mind particularly the Q&A program of Australia’s public broadcaster - a taxpayer-funded public forum which exposes people to opinions they would not seek out. On issues that have become politicised, the people watching the show might be entertained by the antics of those presenting opposing views, but they are unlikely to gain a better understanding of the issues.

There are already many public forums on the Internet. If people choose to join forums that don’t welcome dissent from prevailing views, that is akin to people avoiding public places where demonstrations are held. That choice should be respected.

If a growing proportion of the population chooses to spend an increasing proportion of their time in echo chambers rather than open forums, that is a cultural problem with potential implications for democratic deliberation. It should be dealt with as a cultural problem rather than a public policy problem.

Those of us who are concerned that echo chambers are becoming more prevalent should remember that sectarian echo chambers have warped democratic deliberation in the past. How were those religion-based echo chambers dismantled? I can’t claim to know much about the history, but I doubt that government intervention played a significant role. It was a cultural shift, presumably led by influential people within some factional forums who took a stand in favour of allowing dissenting voices to be heard. Influential people outside the echo chambers must also have been active in encouraging individuals to think for themselves rather than parrot the views of church leaders and sectarian politicians. In many organisations, tolerance of dissent came to be viewed as the norm, and thinking for oneself came to be viewed as a virtue.

Could that happen again?

Tuesday, May 29, 2018

What are the ingredients of good leadership?


As I contemplate leadership failures in some major organisations, in Australia and elsewhere, it strikes me that the people responsible for those failures have not been meeting the norms of behaviour expected of responsible adults. For example, it doesn’t seem like responsible adult behaviour to persist in charging customers for services that they haven’t received.

That has me wondering whether the prevailing emphasis on inspiring organisational leadership rather than efficient administration could be responsible for a decline in the quality of senior executives. It seems to have become possible for some people to rise to the top by learning how to present a vision and flatter stakeholders, without acquiring management skills and business ethics along the way. Perhaps we are seeing a shallow leadership culture displacing the long-standing management culture that encouraged business leaders to take pride in being trustworthy.


Should the gurus who began promoting an emphasis on organisational leadership about 30 years ago be held responsible for the shallowness of leadership in some modern organisations today? As that question arose in my mind, I decided to revisit a book that I had read about 30 years ago - On Becoming a Leader by Warren Bennis, a famous leadership guru. I had a vague recollection that Bennis argued that organisations need leaders, not managers.

My recollection was correct. The book contains a heading: “Leaders, Not Managers”. Under that heading there is a list of differences between leaders and managers. For example: “The manager administers; the leader innovates” and “The manager has his eye always on the bottom line; the leader has his eye on the horizon”. I don’t see recognition that organisations need leaders who have both high-level management and leadership capabilities.

However, the concept of leadership that Bennis advanced is far from shallow. He can’t be held responsible for readers who think leadership just involves mastering jargon about visions and stakeholders.

Bennis presents the view that “leaders are people who are able to express themselves fully”. He explains:

“The key to full self-expression is understanding one’s self and the world, and the key to understanding is learning – from one’s own life and experience”.

Bennis lists the ingredients of leadership as: a guiding vision; the passion to pursue that vision; integrity (encompassing self-knowledge, candour and maturity); trustworthiness; and curiosity and daring.

Those seem to be characteristics that would be displayed by any flourishing adult. As noted in an earlier post, human flourishing also requires alertness to the new opportunities emerging in changing circumstances.

That makes me wonder whether there is any difference between the characteristics of a good leader and those displayed by any flourishing adult human. Toastmasters International, an organisation dedicated to helping members acquire leadership skills, as well as to improve their communication skills, suggests one possible difference: “Great leaders inspire others to follow them”.

That difference is probably not important. Flourishing adults tend to display attributes required to attract followers, even when they don’t seek to be followed. They can’t avoid setting an example of behaviour that some others might choose to follow. As implied in the mission of Toastmasters clubs, the development of communication and leadership skills results in “greater self-confidence and personal growth”.

Perhaps I should try to sum up. It does seem possible that recent leadership problems in some major organisations are attributable to a shallow leadership culture. Some of these problems might have been avoided with a more conventional management culture - less emphasis on public relations and more emphasis on maintaining efficient and ethical management practices. Leadership gurus, such as Warren Bennis, might have contributed to such problems by downplaying the importance of management skills. Nevertheless, the ingredients of leadership identified by Bennis are characteristics of flourishing adults - people who act with integrity. Organisations need leaders who have both high-level management and leadership capabilities.

 One question which I have not addressed is whether it is possible to identify intermediate stages in acquiring leadership capabilities. Do you have to learn to think for yourself before you can be a leader? Does Robert Kegan’s concept of self-authoring represent an intermediate stage in development of leadership capabilities?