Friday, September 21, 2018

Why read a book providing advice to radicals?


I doubt whether many people would consider me to be a radical, even though I look forward to the withering away of the state as the social singularity subverts government activities. My views about politics have been most strongly influenced by people who were once considered to be radicals, including John Locke and Adam Smith, but these days people who hold such views are more likely to be described as conservatives. Following Friedrich Hayek, I reject the conservative label because I am strongly opposed to the use of the powers of government to resist spontaneous social change.

I have been reading Derek Wall’s book, Elinor Ostrom’s Rules for Radicals: Cooperative alternatives beyond markets and states. My main reason for reading the book was my previous advocacy, on this blog, of Elinor Ostrom’s approach to discussion of economic and social issues as a means of promoting dialogue across ideological divides. Elinor Ostrom argued that instead of presuming that individuals sharing common pool resources will inevitably experience the tragedy of the commons, we should leave ideology aside and seek to learn from experience why some efforts to solve commons problems have succeeded while others have failed. I suggested that if we apply Elinor Ostrom’s research methodology to national politics, we should also seek to learn from experience why some countries have been more successful than others in coping with the tendency of interest group activity to have wealth-destroying impacts that are analogous to over-fishing.
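
To see why the over-fishing analogy bites, it may help to make the dynamics concrete. The sketch below is my own toy illustration, not something from Wall’s book or Ostrom’s work: a shared fish stock regrows logistically, and when the combined harvest of the users exceeds the regrowth the stock can sustain, the commons collapses; under an agreed quota of the kind Ostrom documented, it persists.

```python
# Toy common-pool resource: a fishery with logistic regrowth.
# All parameter values are illustrative assumptions, not estimates.

def simulate(harvest_per_agent, agents=10, stock=100.0, capacity=100.0,
             growth=0.3, years=50):
    """Stock remaining after each agent takes a fixed harvest every year."""
    for _ in range(years):
        stock += growth * stock * (1.0 - stock / capacity)   # natural regrowth
        stock = max(stock - agents * harvest_per_agent, 0.0) # total harvest
        if stock == 0.0:  # the fishery has collapsed
            break
    return stock

# Unrestrained harvesting: the total take (15/year) exceeds the maximum
# sustainable regrowth (7.5/year with these numbers), so the stock collapses.
print(simulate(harvest_per_agent=1.5))  # 0.0

# An agreed, monitored quota (an Ostrom-style rule): the total take (5/year)
# stays below regrowth, and the stock settles at a positive level.
print(simulate(harvest_per_agent=0.5))  # roughly 79
```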

Derek Wall describes himself as a “left-wing member of a Green Party”. When I started reading the book I didn’t expect to be able to endorse it as suitable reading for anyone other than people who self-identify as having radical views, or who want to engage in dialogue with radicals. The fact that I endorse it as worthwhile reading for a wider audience illustrates the potential for Elinor Ostrom’s views to have wide appeal across the ideological spectrum. The nonpolemical tone of the book is a credit to the author. The deep impression that Elinor Ostrom’s views have made on Derek Wall will be obvious to everyone who reads the book.

The rules for radicals that Derek Wall has derived from Elinor Ostrom’s writings are listed below, with some brief explanation summarised from the book:

1. Think about institutions. Economic activity is shaped by institutional rules. Formal rules are less important than the “dos and don’ts that one learns on the ground that may not exist in any written document”.

2. Pose social change as problem solving. Those who look at politics and economics in an abstract way often fail to deal effectively with particular issues.

3. Embrace diversity. Polycentrism promotes good decision-making. The idea of a god-like leader or committee with perfect information is a myth.

4. Be specific. Move from slogans to analysis. Keep asking what we can specifically do in a specific context.

5. Listen to the people. People who participate in commons may be more likely to have good ideas about solving problems than outside experts.

6. Self-government is possible. The Ostrom approach of promoting self-government at a local level provides an attractive alternative to both top-down bureaucratic management and exercise of power by populist politicians.

7. Everything changes. Evolution happens. Technological change is creating new opportunities for collective economic activity, e.g. Wikipedia.

8. Map power. If you can map flows of power, you are in a better position to change the flows.

9. Collective ownership can work. It is not always utopian and unrealistic.

10. Human beings are part of nature too. Ecological problems are profoundly political. The politics of humanity has an influence on the rest of nature.

11. All institutions are constructed, so can be constructed differently. Communities need to keep adapting and reinventing institutions. Institutional development should occur constantly and engage all citizens.  

12. No panaceas. Imperfect humans cannot design utopia. If we attempt to construct institutional blueprints, failure is likely.

13. Complexity does not mean chaos. Polycentrism and overlapping jurisdictions can be more efficient than hierarchical structures with linear chains of command.

It seems to me that most of those rules are as relevant to conservatives as to radicals. In all modern democracies conservatives and radicals seem to share the misconception that all economic and social problems can be solved if they can win and hold on to power at a national level.

Monday, September 10, 2018

Can MEMEnomics help us to predict social change?



MEMEnomics is the title of a book by Said Dawlabani, a cultural economist. The book, published in 2013, is an application of the psycho-social model of human development pioneered by Clare Graves and Don Beck. MEMEnomics has been praised by several prominent people, including Deepak Chopra and Bruce Lipton, but I have yet to see any praise from prominent economists. The author does not claim that his book is part of the economics mainstream.

Said Dawlabani suggests that MEMEnomics represents the coming together of two fields: memetics – the study of the replication, spread and evolution of memes – and economics. Just as genes carry the codes that define human characteristics, memes carry the codes that define cultural characteristics. The book is focused on value-system memes – the varying preferences and priorities that humans have in their lives depending on their level of development. The way human values may change with levels of human development was discussed in a recent post on this blog.

The author defines MEMEnomics as “the study of the long-term effects of economic policy on culture as seen through the prism of value systems”. Much of the book is devoted to attempting to explore the cultural implications of changes in economic policy in the United States. The author recognizes the desirability of ensuring that his model can explain history before it is used to attempt to predict the future.

The book identifies three memenomic cycles:

· a “fiefdoms of power” cycle, peaking around 1900, in which American industrialists played a dominant role – large-scale exploitation, fraud and corruption came to define the values of that era;

· a “patriotic prosperity” cycle, peaking around 1950, characterized by economic expansion and government intervention – Keynesian macro-policies and social policies – and ending in stagflation;

· and an “only money matters” cycle, peaking around 1980, characterized by monetarism and deregulation of the economy, and leading to the financial crisis of 2008.

I am not sure the author succeeds in demonstrating that changes in economic policies have led to cultural change. The cycles identified seem to me to be caricatures of beliefs held by powerful elites rather than accurate descriptions of deep-seated changes in values held by ordinary citizens. Nevertheless, it might be reasonable to argue that the cycles represent changes in the ideologies of opinion leaders that have been reflected superficially in the voting preferences and priorities of the American public.

The author suggests that we are standing on the cusp of a fourth cycle, “the democratization of information cycle”, in which technological advances are allowing social networks to play a pivotal role in effecting social change. That view has merit, I think, but this technology-driven change is better viewed as an exogenous factor rather than a new ideology emerging from the down-side of “only money matters”. At this stage it seems that, in the aftermath of the 2008 financial crisis, social networks have aided the return of economic nationalism rather than a policy environment placing higher priority on human development and living in harmony with nature.

As discussed in previous posts (here and here) there does seem to be scope for technological advances to have profound impacts on human values and the way we organise ourselves relative to each other over the next few decades. However, since some of those innovations threaten the scope of government, it seems unlikely that government policy will play a top-down role encouraging them to happen. Policy change seems more likely to occur in response to the demands of ordinary citizens for governments to get out of the way, so citizens can make effective use of new technology.

I enjoyed reading the final chapter of the book, which discusses the concept of a sustainable corporation. Inspirational examples are provided of corporation leaders setting out to define how the core values of their organisations can enable them to simultaneously pursue profits and a higher purpose. Unfortunately, some of the shining examples of 2013 do not shine so brightly today. Said Dawlabani has recently written an interesting article on the reasons why that has happened. Entrepreneurs who are selling new sets of values to investors, staff and customers will always encounter naysayers. In the face of this negativism some of these pioneers will succeed; many will not.

One of the messages I get from MEMEnomics is that individual entrepreneurs are likely to play a crucial leadership role in facilitating transition from a subsistence value system limited to expressions of selfish interests, to a value system that understands the interconnectedness of all life on the planet.

It strikes me that for economics to shed light on the role of the entrepreneur in this process it needs to recognize that the value created by entrepreneurs is likely to have a large non-pecuniary component in future. In pursuit of personal values, some innovative entrepreneurs are offering investors the opportunity to feel that their funds are being used for the betterment of humanity and/or the environment, as well as generating financial returns. Similarly, they are offering employees the opportunity to feel they are engaged in a meaningful venture rather than just an income-earning activity, and are also offering consumers opportunities to feel good about their purchases.

The economic model that seems most relevant in this context is ‘identity economics’ – as discussed in a book of that name by George Akerlof and Rachel Kranton. The key idea is that people gain satisfaction when their actions conform to the norms and ideals of their identity. In a tribal society, identity economics is like identity politics – people adopt the norms and ideals of the tribe to which they belong. In a cosmopolitan society the relevant norms and ideals are those of the market economy, incorporating a large measure of respect for the rights of others and social trust. Over the next few decades, hopefully the relevant norms and ideals will incorporate greater concern for the well-being of all humans and other living creatures.
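
A toy model may help to make that key idea concrete. The sketch below is my own illustrative rendering of the Akerlof-Kranton intuition, not their actual formulation: utility is a material payoff plus an identity payoff that falls the further an action departs from the norms of the person’s social category.

```python
# Toy identity-economics utility: material payoff plus an identity term
# that penalises deviation from the group's norm. The quadratic form and
# all numbers are illustrative assumptions, not Akerlof and Kranton's model.

def utility(action, norm, identity_weight):
    material = 10 * action - action ** 2                 # profit peaks at action = 5
    identity = -identity_weight * (action - norm) ** 2   # loss from violating the norm
    return material + identity

actions = [a / 10 for a in range(0, 101)]  # candidate actions from 0.0 to 10.0

# With no identity concerns the agent simply maximises profit;
# with a norm of 2 and a positive weight, her choice is pulled toward the norm.
print(max(actions, key=lambda a: utility(a, norm=2, identity_weight=0)))  # 5.0
print(max(actions, key=lambda a: utility(a, norm=2, identity_weight=4)))  # 2.6
```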

Saturday, July 28, 2018

How will human values evolve as we approach the social singularity?


As explained in a recent post, Max Borders has coined the term ‘social singularity’ to describe the transformation in social organisation that could occur following mass adoption of secure networking technologies. Some existing mediating structures could become obsolete, new forms of coordination could emerge, and we might collaborate as never before.

In his book, The Social Singularity, Max relies heavily on spiral dynamics to discuss the way cultural values may evolve as we approach the social singularity. Spiral dynamics was developed by the psychologist Clare Graves and popularised by Don Beck and Christopher Cowan. It postulates that at different stages of development different values become dominant to help people to function in the life circumstances in which they find themselves.

The spiral is summarised in the graphic shown at the beginning of this post (copied from the toolshero web site). In brief, at the first stage of the spiral, survival values are dominant. At the second stage, the dominant values are those of the tribe or clan. At the next stage, we have values related to power, glory and conquest. Then we have loyalty and deference to higher authority. This is followed by the values of science and commerce, and then the ethics of care and the politics of equality.

As we approach the social singularity, prior value systems will be transcended: more people will come to see themselves as interdependent beings, requiring some autonomy and respecting the autonomy of others. Beck and Cowan described the final, holistic, stage as an integrative system that “combines an organism’s necessary self-interest with the interests of the communities in which it participates”.  Max comments:

“This way of seeing the world is neither rugged individualism nor crude communitarianism. It requires seeing ourselves through others and others through ourselves”.

What evidence do we have that humanity is heading in that direction? Questions have been raised as to whether spiral dynamics is firmly grounded in evolutionary biology and anthropology, but from the little I know of ancient history it seems to provide a plausible account of the way different cultures have emphasized different virtues. If we look at the economic history of the last few centuries, the story told by spiral dynamics seems consistent with the work of Joel Mokyr and Deirdre McCloskey about the emergence of a culture of economic growth, first in western Europe and then spreading to other parts of the world. The theory also seems consistent with the empirical work of Ronald Inglehart and Chris Welzel on value change, based on the World Values Survey. As noted on this blog a few years ago, Chris Welzel’s book Freedom Rising provides evidence that as societies have advanced in terms of technological sophistication and education, emancipative values – relating to autonomy, choice, equality etc. – have become more widely shared, and the dominant life strategies of populations have shifted from an extrinsic focus on material circumstances to an intrinsic focus on emotional qualities.

That research doesn’t tell us how dominant values might evolve in the years ahead, but Max Borders makes clear that he sees people who are comfortable with subversive innovation – innovation that has potential to replace existing mediating structures, including government agencies – as “the standard bearers for a future in which a better world can be dreamed by visionaries, socially constructed, and hard-coded into existence”. Max adds:

“As dreamers and doers, we are prepared to forgo the spectacle of elections and the blood sport of campaign politics. We want to take a vantage point from high above, looking at how we can reweave the latticework of human interaction to create a great reconciliation between private interest and community good.”

If we view spiral dynamics and the values of the social singularity in normative terms, Robert Nozick’s suggestion that the pursuit of higher layers of ethics can be thought of as building on the ethics of respect seems highly relevant. As I noted some years ago, Nozick saw four layers of ethics:

· The most fundamental layer – the ethics of respect – mandates respect for the life and property of other people.

· The second layer – the ethics of responsiveness – mandates acting in a way that is responsive to the inherent value of others, enhancing and supporting it, and enabling it to flourish.

· The third layer – the ethics of caring – ranges from concern and tenderness to deeper compassion, ahimsa and love for all people (perhaps for all living creatures).

· The top layer – the ethics of Light – calls for being a vessel and vehicle of truth, beauty, goodness and holiness.

Subversive innovation offers a basis to hope that the ethics of Light could one day pervade the cultural values of many humans rather than those of only a few saints and sages.

Friday, July 20, 2018

How can we overcome confirmation bias?


This guest post by Leah Goldrick was originally published on her excellent blog, Common Sense Ethics. Leah acknowledges that confirmation bias is linked to pattern recognition, which serves a useful purpose. The confirmation bias problem arises when we seek out information to confirm what we believe and ignore everything else.

The documentary that Leah refers to in her first paragraph is worth watching. It illustrates how easily a group of people who did not seem particularly gullible acquired an unshakeable belief that the end of the world would occur on 21 May 2011.



Why is it so hard for us to change our beliefs or to get other people to change their minds? A new documentary film, Right Between Your Ears, examines the science and psychology of how people form convictions. According to producer Kris De Meyer, a neuroscientist, certain aspects of human psychology make it very hard for us to be objective or unbiased.

People usually form beliefs by accepting what they've been told by someone they trust: parents, teachers, media and so on. Our beliefs can change when we learn new things. But when we become convinced of something, our brain operates much as it does with a religious belief. We may react with anger when challenged. This human tendency often leads us to seek out information which confirms what we already believe and ignore everything else - a cognitive bias called confirmation bias.

It seems obvious why confirmation bias can be a problem - it can prevent us from making good decisions. It makes us rigid thinkers. Someone can easily fool us by simply appealing to our established belief systems. The good news is that there are some practical strategies to overcome this natural human shortsightedness that I'll let you in on at the end of the post.

How We Form Beliefs

Let me back up for just a second. What led me to write this post (besides my abiding interest in critical thinking) was the Shakespeare authorship course I recently took online via the University of London. Along with being just about the most interesting topic ever, the instructor, Dr. Ros Barber, focused the first lesson on the science of how beliefs are formed, cognitive bias, and how belief systems can crystallize into orthodoxies which may not be questioned without ridicule.

Dr. Barber interviews Kris De Meyer, a neuroscientist and documentary film maker currently working in the Department of Neuroimaging at King's College London's Institute of Psychiatry, Psychology and Neuroscience, about how we form our beliefs in the first place.

According to De Meyer, we form fairly rigid belief systems or perceptual frameworks out of necessity as we go through life in order to handle the information continually coming at us. Usually, our perceptual framework serves us quite well. But it can also be a major intellectual handicap when we are confronted with information which undercuts our established belief systems. De Meyer states:

"But beliefs become strongly held and particularly if we build our identity around them, they begin to act as perception filters. Indeed, it might be useful to think of a belief as a perceptual framework, something that helps us make sense of the world around us." 

Confirmation Bias

The problem with our perceptions being filtered through our belief structures is that it can create something called confirmation bias. We tend to interpret new information in a way that strengthens our preexisting beliefs. ​​When we are confronted with information which conflicts with our beliefs, we will often find ways to discard it. We also tend to search out information which confirms our beliefs rather than looking for more neutral or contradictory information.

For our general functioning in the world, we must keep our perceptual frameworks fairly rigid. So even when our brain finds data that is anomalous, confirmation bias can lead us to explain it away as an error. Experiments in the 1960s hinted that people are biased towards their beliefs. Later experiments focused on our natural tendency to test ideas in a one-sided way, focusing on one possibility and ignoring alternatives.
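
That one-sided tendency is easy to demonstrate. The sketch below is a hypothetical illustration in the spirit of Wason's rule-discovery experiments (it is not from the film or the course): every test chosen to fit the believed rule passes, so the belief feels confirmed, even though the true rule is much broader.

```python
# One-sided hypothesis testing, in the spirit of Wason's 2-4-6 task.
# The true rule is "any increasing sequence"; the believed rule is the
# narrower "each number goes up by 2". Illustrative sketch only.

def true_rule(seq):      # the rule the experimenter actually uses
    return all(a < b for a, b in zip(seq, seq[1:]))

def believed_rule(seq):  # the subject's hypothesis
    return all(b - a == 2 for a, b in zip(seq, seq[1:]))

confirming_tests = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]  # chosen to fit the belief
probing_tests = [(1, 2, 3), (5, 10, 20)]                 # violate the belief on purpose

# Every confirming test passes, so the wrong belief is never challenged...
print(all(true_rule(t) for t in confirming_tests))       # True

# ...only tests the believed rule would reject can expose the difference.
for t in probing_tests:
    print(t, "fits belief:", believed_rule(t), "fits truth:", true_rule(t))
# (1, 2, 3) fits belief: False fits truth: True
# (5, 10, 20) fits belief: False fits truth: True
```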

Anyone can suffer from confirmation bias: teachers, Shakespeare scholars, even scientists. In one study on confirmation bias involving scientists, over half of laboratory experimental results were inconsistent with the scientists' original hypotheses. In these cases, the scientists were reluctant to consider that data as valid. The anomalous finding was usually classified as a mistake. Even after scientists had produced an anomaly more than once, they would often choose not to follow up.

When we perceive, we construct systems of beliefs inside our heads like a lawyer trying to prove a case. The more strongly we are engaged in a topic, the more likely we are to dismiss contradictory evidence. Basically, on both sides of any debate, we have a system of beliefs that tells us that we are right and the other side is wrong.

According to Ros Barber, "[When any conflict happens] it's been described as 'a dialog of the deaf' because people can't hear the other point of view. They just think it's totally invalid."

Cognitive Dissonance

So why does confirmation bias happen? It might be because of wishful thinking, or because of our limited mental capacity to process information. It could also have to do with a failure to imagine alternate possibilities (more on this later). Another explanation for confirmation bias is that people are afraid of being wrong, and fail to ask the right probing questions about their beliefs, instead reasoning from their already held conclusions.

When we are confronted with contradictory evidence, it causes something called cognitive dissonance - mental distress caused by information that doesn't fit in with our current understanding of the world. Cognitive dissonance is uncomfortable and people will sometimes go to great lengths to avoid it.

Cognitive dissonance was first theorized by psychologist Leon Festinger who argued that we have an inner drive to hold consistent beliefs. Holding inconsistent beliefs causes us to feel disharmonious. Festinger studied a cult whose members believed that the earth was going to be destroyed by a flood. He investigated what happened to the cult members, especially the committed ones who had given up their homes and jobs, after the flood did not happen on the proposed date.

The most committed cult members were more likely to rationalize their original beliefs (confirmation bias) even after experiencing cognitive dissonance in the face of the flood not happening. Loosely affiliated members were much more likely to admit that they had simply made a mistake and to move on. The more attached we are to a belief, the harder it is to change it. 

How To Think (and Debate) With Less Bias

So what are the best strategies to overcome our natural human shortsightedness and bias? The first is to keep emotional distance in reasoning, and the second is to consider the other side (or sides) of any debate, a technique called the "consider the opposite" strategy.

1. Keep Emotional Distance When Reasoning

Given the natural human tendency towards confirmation bias, it is important to be at least somewhat dispassionate when reasoning and debating. I like to call this emotional distance. Emotional distance is just as much a character trait of a reasonable person as it is a strategy for handling cognitive biases.

Confirmation bias may in part stem from our desire not to be wrong, so by keeping emotional distance, you are essentially willing to admit to yourself that you could have some things wrong. Don't be too attached to any particular piece of evidence. In any difficult debate we all may get certain parts of the puzzle incorrect.

Look out for signs of confirmation bias in yourself. Remember that the more strongly held your beliefs are, the more likely you are to refuse to consider alternative evidence - like the cult members who invested everything in their belief in an impending flood.

Emotional distance also involves viewing debate as dialog rather than an angry fight. If your ego gets too caught up defending a certain belief, you are more likely to get angry when contradicted. Angry people usually double down and become more extreme on their point of view rather than considering someone else's. Keep in mind that politeness might actually be the secret weapon for getting someone to overcome their bias. Kris De Meyer suggests:

"When we do feel a pang of anger at being challenged, rather than responding very quickly we can step away from it for maybe a few hours or a day, think carefully about where that person is coming from, and then see if we can give them more constructive comments that then doesn't spark his or her angry response. Because it's those feelings of anger at being misunderstood and of being misrepresented that really are the ones that drive us towards more certainty. And if the conversation becomes amicable, it can be heated and passionate without being acrimonious and negative. The way to use [your knowledge of confirmation bias] is to question yourself and to reflect on your own assumptions and your own interactions with other people."

Maintaining emotional distance is powerful, but it may not be enough to overcome biases, which is why we should also use this second strategy:

2. Consider the Opposite

Confirmation bias may in part be the result of our limited mental capacity to imagine alternative scenarios. The consider-the-opposite strategy helps us to envision how else things might be. In a recent study, this technique was found to work better than simply trying to remain objective.

Considering the opposite in everyday practice works like this: you take a look at a set of facts about something. Generally, you would try to discern whether the facts support your belief or not. If you are experiencing confirmation bias, you would probably imagine that the facts do actually support your belief. But when you force yourself to consider the opposite, you instead imagine that the facts point the opposite way, disproving your belief. This helps you to imagine alternatives to what you already believe.
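
For readers who like the logic spelled out, here is one way to make considering the opposite mechanical. This is my own toy Bayesian framing of the strategy, not something from the study mentioned above: ask not just how well the facts fit your belief, but how well they would fit the rival belief, because only the ratio of the two matters.

```python
# 'Consider the opposite' as a likelihood comparison. The probabilities
# below are illustrative assumptions, not data from any study.

def posterior_odds(prior_odds, p_facts_if_belief_true, p_facts_if_belief_false):
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
    return prior_odds * (p_facts_if_belief_true / p_facts_if_belief_false)

# One-sided reading: "the facts fit my belief 90% of the time" feels decisive.
# Considering the opposite: the same facts would fit the rival belief 80% of
# the time, so the evidence should barely move you.
print(posterior_odds(1.0, 0.9, 0.8))  # 1.125 -> belief barely strengthened

# If the facts would be rare under the rival belief, they are genuinely strong.
print(posterior_odds(1.0, 0.9, 0.1))  # 9.0 -> belief strongly supported
```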

The consider-the-opposite strategy works particularly well with diametrically opposed beliefs, but always bear in mind that there may be more than one alternate possibility. Be willing to entertain various possibilities rather than falling victim to false dichotomies.