Category Archives: Decision Making

Steven Levitt & Stephen Dubner: Freakonomics

Riding in Malcolm Gladwell’s wake on the wave of popular social science books came a pair of writers who set the pattern for many subsequent journalist/social-scientist pairings. Steven Levitt was a rising star in the world of economics when he was interviewed by the successful journalist Stephen Dubner.

When the publishing world offered sufficient incentives (in the form of an author’s advance), they began a collaboration that has resulted in four books and over 5 million sales. More importantly, it opened our minds to the world of perverse incentives that the two dubbed ‘Freakonomics’.

Steven Levitt & Stephen Dubner

Steven D Levitt

Steven Levitt is a successful academic. Born in New Orleans in 1967, he studied economics at Harvard, graduating in 1989. He then spent a couple of years in management consulting, specialising in decision-making, before enrolling in a PhD programme at MIT.

His time at MIT was far from conventional. Whilst his peers did the standard thing of analysing case studies and studying theory, Levitt discerned a simple truth about academic life: success depends on published papers. So before even starting his formal thesis work, he was gathering and analysing his own data, conducting his own research, and writing his first papers.

His varied and curious approach to economics, and his succession of published papers, paid off. He was awarded his PhD in 1994 and, following a period as a research fellow at Harvard, was offered a post in arguably the most prestigious economics department in the US, at the University of Chicago. In just two years, he was made a professor.

He is now William Ogden Distinguished Service Professor of Economics and was, in 2003, the recipient of the John Bates Clark Medal. This is awarded every two years by the American Economic Association to the most promising US economist under the age of 40.

In the same year, a New York Times journalist interviewed Levitt for an extended article. That journalist was Stephen J Dubner.

Stephen J Dubner

Stephen Dubner was born in 1963, in New York, and started writing young; his first published work appeared in a children’s magazine. He studied at Appalachian State University in North Carolina, graduating in 1984, and focused on a music career until he switched to writing in 1988 and enrolled in the Master of Fine Arts in Writing programme at Columbia University. After graduating in 1990, he taught in the English Department and started work as a journalist, becoming a story editor at The New York Times Magazine.

Dubner’s journalistic writing is highly regarded, and he has also written for Time, The New Yorker, and the Washington Post. In 2003, he interviewed a rising star among academic economists, called Steven Levitt.

The Spirit of Freakonomics

The thing about Freakonomics is that the book series, New York Times columns, and blogs range over a wide arena of social science and economics. What connects it all is the idea that, whilst everyone knows that people respond to incentives, research shows that some of our responses are surprising. So surprising, shocking, delightful, and curious that the stories of what happens are compelling, and the unravelling of why it happens often reads like the most gripping detective fiction.

The other vital aspect of the spirit of freakonomics is the combination of an academic economist’s eye for data and the story-telling capability of a seasoned journalist. These are held together by the glue of a shared sense of curiosity and delight in the phenomena that Levitt and Dubner explore.

The books make for a great read. They are thought-provoking and enhanced by Levitt’s analysis of large amounts of data. Indeed, the use of data is another recurring theme. However, this is not to say that Levitt and Dubner’s conclusions have gone unchallenged. With astonishing claims, like ‘abortion cuts crime’, comes a welter of criticism.

In some cases the critiques have hit home; in others, Levitt and Dubner have successfully countered them. All of their writing, though, is entertaining and thought-provoking. It is no wonder that their books have sold so well. And, on the margins, they also highlight some important truths that managers would do well to note:

  1. People respond to incentives.
  2. People’s response to incentives is not always what you would expect and is sometimes hard to understand.
  3. Big data sets can hold within them valuable and surprising conclusions. We can uncover useful insights and, equally, demolish cherished assumptions.
  4. Working with big data sets in the messy and complex world of human interactions is tricky. Separating coincidence from causation among correlated data is hard. And extracting data where many confounding variables are present will open you up to biting challenge.
  5. Socio-economic evidence should inform policy, but not dictate it.
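Point 4 is easy to demonstrate with a toy simulation (all numbers invented for illustration): two variables that never influence each other will still correlate strongly when a third, confounding variable drives them both.

```python
import random

random.seed(0)

# A confounder z drives both x and y. Neither x nor y influences the
# other, yet the two correlate strongly: correlation is not causation.
z = [random.gauss(0, 1) for _ in range(1000)]
x = [zi + random.gauss(0, 0.5) for zi in z]
y = [zi + random.gauss(0, 0.5) for zi in z]

def pearson(a, b):
    """Pearson correlation coefficient, computed by hand."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / n
    va = sum((ai - ma) ** 2 for ai in a) / n
    vb = sum((bi - mb) ** 2 for bi in b) / n
    return cov / (va * vb) ** 0.5

print(round(pearson(x, y), 2))  # close to 0.8, despite no causal link
```

Any analysis that saw only x and y could be seduced into a causal story; separating coincidence from causation takes more than a correlation coefficient.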

The Freakonomics Library

Steven Levitt at TED

Steven Levitt has spoken twice at TED events, in 2004 and 2005.

Julia Galef: Scout Mindset

What the world needs now, more than anything else, is a greater degree of rationality. And Julia Galef is on a mission to help us get there.

Julia Galef

Short Biography

Julia Galef was born in 1983, in Maryland. She studied statistics at Columbia University, graduating in 2005. Galef initially continued on an academic path, starting a PhD in economics. However, it was not for her, and she moved to New York and began working as a freelance journalist.

There, she joined the New York Skeptics and, with philosopher Massimo Pigliucci, started the podcast Rationally Speaking in 2010. In 2015, Pigliucci stepped down, and Galef has continued as the sole host.

In 2011, Galef moved to California to join a group of friends who had secured funding to start the Center for Applied Rationality. It began its work in 2012 and predominantly provides training in how to think more rationally. She is currently its president.

Hang on, Galef is a Public Intellectual…
What has that to do with Management?

Everything.

Management needs to be more rational. This isn’t to say there is no place for intuition; rather, intuition serves us well only in situations where we have deep experience.

And in a rapidly changing world where technology, commercial opportunities, and social policy are evolving at a phenomenal rate, none of your really crucial decisions can possibly be based on deep experience. Nobody has that.

So rational thinking is your best strategy for sound decision-making. And that means eliminating bias and exercising the techniques of good judgment.

Soldiers and Scouts: Galef’s Brilliant Metaphor

Galef has a great metaphor for understanding two mindsets, or ways of approaching reality. These mindsets manifest most clearly when we get into discussions or arguments in which we disagree with the other person’s analysis.

Soldier Mindset

A soldier needs to fight to survive. They are therefore trained to be defensive and combative. And by the nature of fighting forces, they are tribal too. The Soldier Mindset is therefore one of feeling safest when we are certain, and fighting against an opponent to protect ourselves. This may be defensive or offensive in nature, but there is value in being right and defending our position – even if it means attacking the other person.

Galef doesn’t say it, but I will. How familiar is this in modern western political discourse?

Scout Mindset

Scouts, on the other hand, are tasked not to fight but to gather information. Facts, data, and evidence are valuable to a scout, as is objective assessment of what they learn. Consequently, scouts are open to re-evaluating their conclusions in the light of new information. The Scout Mindset is one of curiosity and a desire to cut through bias and prejudice to get at the truth. There is value for a scout in testing long-held assumptions and beliefs, so for them, there is no loss of face in changing their opinion.

Mindset, not Intelligence

This is not about intelligence, any more than Carol Dweck’s Fixed and Growth Mindsets are about intelligence. It is about how we address the complexities of the real world.

If what you value is the certainty of a simple analysis, and don’t want to let a few rogue facts spoil a good story, then you have a Soldier Mindset. And those facts will, eventually, spoil your story.

If, on the other hand, you recognise that the world is complex and the decisions you make are neither straightforward nor familiar, then you may feel you need to interrogate the data fully, listen to different perspectives, and draw careful but provisional conclusions. These will stand until conflicting evidence forces you to re-evaluate.

That is the Scout Mindset, and it sounds like the basis of grown-up management to me.

Julia Galef at TED

Here is Galef speaking about the Soldier and Scout Mindsets at TED, in 2016.

Philip Tetlock: Expert Judgment

Philip Tetlock has done more than any other academic to help us understand the process of forecasting and making predictions. He has shown us why experts don’t do well, and, with his latest work, has found the secret sauce of ‘Superforecasting’.

Philip Tetlock

Short Biography

Philip Tetlock was born in 1954 and grew up in Toronto. He studied psychology, gaining his BA and MA at the University of British Columbia, before moving to the US to research decision-making for his PhD at Yale.

His career has been entirely academic, with posts at the University of California, Berkeley (Assistant Professor, 1979-1995), Ohio State University (Chair of Psychology and Political Science, 1996-2001), and a return to UC Berkeley (Chair at the Haas Business School, 2002-2011). Currently, he is Annenberg University Professor at the University of Pennsylvania, where he is jointly appointed in Psychology, Political Science, and the Wharton Business School.

Tetlock’s early books are highly academic, but he started to come to prominence with the publication, in 2005, of ‘Expert Political Judgment: How Good Is It? How Can We Know?’ This book has become highly influential by documenting the results of Tetlock’s research into the forecasting and decision-making of experts. The bottom line: the more prominent the expert, the poorer their ability to forecast accurately.

Tetlock’s most recent book, 2015’s ‘Superforecasting: The Art and Science of Prediction’, is one of those few magic books that can change your view of the world, make you smarter, make you feel wiser, and inspire you at the same time. It is co-written with journalist Dan Gardner (whose earlier books cover Tetlock’s work [Future Babble], and that of Daniel Kahneman [Risk]) and so is also highly readable.

The Tetlock Two-step

In ‘Expert Political Judgment’, Tetlock is a pessimist. He finds substantial evidence to warn us not to accept the predictions of pundits and experts. They are rarely more accurate than a chimp with a dartboard (okay, he actually compares them to random guessing).

Ten years later, in ‘Superforecasting’, Tetlock is an optimist. He still rejects the predictions of experts, but he has found light at the end of the predictions tunnel. The people he calls ‘Superforecasters’ are good at prediction; far better than experts, far better than chance, and highly consistent too.

If you want to understand how to make accurate predictions and reliable decisions, you need to understand Tetlock’s work.

Hedgehogs and Foxes: The Failure of Experts

In a long series of thorough tests of forecasting ability, Tetlock discovered a startling truth: experts rarely perform better than chance, and simple computer algorithms that extrapolate the status quo often outperform them. The best human predictors were those with less narrow expertise and a broader base of knowledge. In particular, the higher the public profile of the expert, the poorer their performance as a forecaster.

This led Tetlock to borrow a metaphor from the philosopher Isaiah Berlin: the fox knows many things, but the hedgehog knows one big thing. The experts are hedgehogs: they know one thing very well, but are often outsmarted by the generalists who recognise the limitations of their knowledge and therefore take a more nuanced view. This is often because experts create for themselves a big theory that they are then seduced into thinking will explain everything. Foxes don’t have a grand theory. So they synthesise many different points of view, and therefore see the strengths and weaknesses of each one, better than the hedgehogs.

One result of Tetlock’s work was that the US Government’s Intelligence Advanced Research Projects Activity (IARPA), a research agency serving the US Intelligence Community, set up a forecasting tournament. Eventually, Tetlock moved from helping to design and manage the tournament to participating in it.

Superforecasting: The Triumph of Collective Reflection

Tetlock, along with his wife (University of Pennsylvania Psychology and Marketing Professor, Barbara Mellers), created and co-led the Good Judgment Project, a collaborative team that won the IARPA tournament consistently.

The book, Superforecasting, documents what Tetlock learned about how to forecast well. He identified ‘Superforecasters’ as people who can consistently make better predictions than other pundits. Superforecasters think in a different way. They are more thoughtful, reflective, open-minded and intellectually humble. But despite their humility, they tend to be widely read, hard-working, and highly numerate.

In a Tweet (3 June 2016 – https://twitter.com/PTetlock/status/738667852568350720), Tetlock said of Trump University’s ‘Talk Like a winner’ guidelines:

Guidelines for “talking like a winner” are roughly the direct opposite of those for thinking like a superforecaster

The other characteristics that enable superforecasting, which you can implement in your own organisation’s decision-making, are:

  1. Screen forecasters for high levels of open-mindedness, rationality and fluid intelligence (reasoning skills), and low levels of superstitious thinking (Tetlock has developed a ‘Rationality Quotient’ or RQ). Also choose people with a ‘Growth Mindset’ and ‘Grit’.
  2. Collect forecasters together to work as a team
  3. Aim to maximise diversity of experiences, backgrounds, and perspectives
  4. Train them in how to work as a team effectively
  5. Good questions get good answers, so focus early effort on framing the question well to reduce bias and increase precision
  6. Understand biases and how to counter them
  7. Embrace and acknowledge uncertainty
  8. Take a subtle approach and use high levels of precision in estimating probabilities of events
  9. Adopt multiple models, and compare the predictions each one offers to gain deeper insights
  10. Start to identify the best performers, and allocate higher weight to their estimates
  11. Reflect on outcomes and draw lessons to help revise your processes and update your forecasts
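A couple of these steps can be sketched in code. The snippet below uses the Brier score (the accuracy measure used in forecasting tournaments of this kind) to weight forecasters by track record, as in step 10. The names, data, and inverse-Brier weighting scheme are my own invention, purely for illustration:

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between probability forecasts and 0/1 outcomes.
    Lower is better; always guessing 50% scores 0.25."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

def weighted_consensus(track_records, new_forecasts):
    """Weight each forecaster's new prediction by past accuracy
    (inverse Brier score, normalised so the weights sum to 1)."""
    weights = {name: 1.0 / (brier_score(f, o) + 1e-9)
               for name, (f, o) in track_records.items()}
    total = sum(weights.values())
    return sum(new_forecasts[name] * w / total for name, w in weights.items())

# Invented track records: (past probability forecasts, actual 0/1 outcomes).
records = {
    "ann": ([0.9, 0.2, 0.8], [1, 0, 1]),   # bold and well calibrated
    "bob": ([0.5, 0.5, 0.5], [1, 0, 1]),   # always hedges at 50%
}
consensus = weighted_consensus(records, {"ann": 0.7, "bob": 0.4})
print(round(consensus, 2))  # pulled strongly towards ann's 0.7
```

The point of the sketch is step 10 in miniature: once outcomes come in, the forecasters with the better track records earn more say in the team’s collective estimate.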

 

Tetlock Explaining Fox and Hedgehog Theory

Victor Vroom: Motivation and Decision-making

Why do people make the choices they do at work, and how can managers and leaders make effective decisions? These are two essential questions for managers to understand. They were both tackled with characteristic clear-thinking and rigour by one man.

Victor Vroom

Short Biography

Victor Vroom was born in 1932 and grew up in the suburbs of Montreal. He was a bright child with little academic interest, unlike his two older brothers. Instead, his passion was big-band jazz and, as a teenager, he dedicated up to 10 hours a day to practising alto sax and clarinet.

After leaving school, Vroom found that moving to the US as a professional musician was tricky, so he enrolled in college, where he learned, through psychometric testing, that the two areas of interest that would best suit him were music (no surprise) and psychology. Unfortunately, whilst he now enjoyed learning, his college did not teach psychology.

At the end of the year, he was able to transfer, with a full year’s credit, to McGill University, where he earned a BSc in 1953 and a Masters in Psychological Science (MPs Sc) in 1955. He then went to the US to study for his PhD at the University of Michigan. It was awarded to him in 1958.

His first research post was at the University of Michigan, from where he moved to the University of Pennsylvania in 1960 and then, in 1963, to Carnegie Mellon University. He remained there until receiving a second offer from Yale University – this time to act as Chairman of the Department of Administrative Sciences, and to set up a graduate school of organisation and management.

He has remained there for the rest of his career, as John G Searle Professor and, currently, as BearingPoint Professor Emeritus of Management & Professor of Psychology.

Vroom’s first book was Work and Motivation (1964) which introduced the first of his major contributions; his ‘Expectancy Theory’ of motivation. He also collaborated with Edward Deci to produce a review of workplace motivation, Management and Motivation, in 1970. They produced a revised edition in 1992.

His second major contribution was the ‘Vroom-Yetton model of leadership decision making’. Vroom and Philip Yetton published Leadership and Decision-Making in 1973. He later revised the model with Arthur Jago, and together, they published The New Leadership: Managing Participation in Organizations in 1988.

It is also worth mentioning that Vroom had a bruising experience when he was pursued through the courts by an organisation he had earlier collaborated with. They won their case for copyright infringement, so I shall say no more; the judgement is available online. Vroom’s account of this, at the end of a long autobiographical essay, is an interesting read. He wrote the essay as a past president of the Society for Industrial and Organizational Psychology (1980-81).

Vroom’s Expectancy Theory of Motivation

Pocketblog has covered Vroom’s expectancy theory in an earlier blog, and it is also described in detail in The Management Models Pocketbook. It is an excellent model that deserves to be far better known than it is. Possibly the reason is that Vroom chose to express his theory as an equation: bad move! Most people are scared of equations. That’s why we at Management Pocketbooks prefer to use the metaphor of a chain. Motivation breaks down if any of the links is compromised. Take a look at our short and easy-to-follow article.
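For the equation-tolerant, the textbook form of the theory multiplies three judgements together, which is exactly why the chain metaphor works: one zero link zeroes the whole product. A minimal sketch (the variable ranges are the common textbook convention, not Vroom’s exact formulation):

```python
def motivational_force(expectancy, instrumentality, valence):
    """Vroom's expectancy theory in its simplest textbook form:
    Force = Expectancy x Instrumentality x Valence.

    expectancy:      belief that effort will produce the performance (0 to 1)
    instrumentality: belief that the performance will bring the reward (0 to 1)
    valence:         the value the person places on that reward
    """
    return expectancy * instrumentality * valence

# One broken link zeroes the whole chain: here the reward is highly
# valued and effort clearly pays off, but the person does not believe
# good performance will actually be rewarded.
print(motivational_force(0.9, 0.0, 1.0))  # 0.0
```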

The Vroom-Yetton-Jago Model of Leadership Decision-making

This one is a bit of a handful. Vroom has expressed some surprise that it became a well-adopted tool and, more recently, noted that societies and therefore management styles have changed, rendering it less relevant now than it was in its time. That said, it is instructive to understand the basics.

Decision-making is a leadership role, and (what I shall call) the V-Y-J model is a situational leadership model for what style of decision-making a leader should select.

It sets out the different degrees to which a manager or leader can involve their team in decision-making, and also the situational characteristics that would lead to a choice of each style.

Five levels of Group Involvement in Decision-making

Level 1: Authoritative A1
The leader makes their decision alone.

Level 2: Authoritative A2
The leader invites information and then makes their decision alone.

Level 3: Consultative C1
The leader invites group members to offer opinions and suggestions, and then makes their decision alone.

Level 4: Consultative C2
The leader brings the group together to hear their discussion and suggestions, and then makes their decision alone.

Level 5: Group Consensus
The leader brings the group together to discuss the issue, and then facilitates a group decision.

Choosing a Decision-Making Approach

The V-Y-J model sets out a number of considerations. Research indicates that when leaders choose a decision approach that follows these considerations, they self-report greater success than when the model is not followed. The considerations are:

  1. How important is the quality of the decision?
  2. How much information and expertise does the leader have?
  3. How well structured is the problem or question?
  4. How important is group-member acceptance of the decision?
  5. How likely is group-member acceptance of the decision?
  6. How much do group members share the organisation’s goals (against pursuing their own agendas)?
  7. How likely is the group to be able to reach a consensus?
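To make the idea concrete, here is a drastically simplified sketch of how answers to considerations like these might map onto the five styles. This is emphatically not Vroom and Yetton’s actual decision tree (which works through the questions in a set order); it only illustrates the shape of the logic:

```python
def suggest_style(quality_matters, leader_has_info,
                  acceptance_critical, team_accepts_solo_decisions):
    """A drastically simplified sketch of V-Y-J-style reasoning.

    NOT the authors' actual decision tree; it only shows how
    situational answers can map onto the five styles:
    A1, A2 (authoritative), C1, C2 (consultative), G (group consensus).
    """
    if not quality_matters and (not acceptance_critical
                                or team_accepts_solo_decisions):
        return "A1"  # low stakes, team will go along: decide alone
    if quality_matters and leader_has_info:
        return "A2" if team_accepts_solo_decisions else "C2"
    if acceptance_critical and not team_accepts_solo_decisions:
        return "G"   # buy-in is essential: facilitate group consensus
    return "C1"      # consult members individually, then decide

# A low-stakes call the team will accept: just decide.
print(suggest_style(False, True, False, True))   # A1
# High stakes, leader lacks information, acceptance is critical: consensus.
print(suggest_style(True, False, True, False))   # G
```

The real model’s value lies less in any particular mapping than in forcing a leader to ask these situational questions before defaulting to a habitual style.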

A Personal Reflection

I have found both of Vroom’s principal models enormously helpful, both as a project leader and as a management trainer. It is somewhat sad that, in Vroom’s own words, ‘the wrenching changes at Yale and the … lawsuit have taken their emotional and intellectual toll.’ These two events created a huge mental and emotional distraction for Vroom in the late 1980s. At a time when he should still have been at the peak of his intellectual powers, he was diverted from his research, and I wonder what insights we may have lost as a result.


 

Pocketbooks you might Like

The Motivation Pocketbook – has a short introduction to Vroom’s Expectancy Theory, which it refers to as ‘Valence Theory’. It also has a wealth of other ideas about motivation.

The Management Models Pocketbook – has a thorough discussion of Expectancy Theory, and also Motivational Needs Theory, alongside eight other management models.

 

 

Philip Green: Risk & Control

Sir Philip Green is rarely out of the news. A self-made businessman, he has long been a dominant figure in the UK retail scene and a figure with much to admire and much to criticise. When a TV audience is split 50:50 between loving and loathing a programme, it usually becomes a hit. On those grounds, Philip Green is a business hit!

Sir Philip Green

Short Biography

Philip Green was born in the Surrey (now South London) town of Croydon in 1952, where his parents were both involved in property and retail businesses. When he was twelve and away at boarding school, his father died, leaving his mother to run the family businesses – something she carried on doing into her eighties. Green, who had been used to earning money from a young age on the forecourt of the family’s petrol (gas, in the US) station, left school as soon as he could to enter the world of work.

His endeavour let him work his way up through all levels of a shoe importer, where he discovered a real talent for selling. When he left the company, he travelled the world, learning practical lessons in business that he brought back to the UK. Through the late 1970s and early 1980s, he became adept at deals involving buying stock nobody else wanted and selling it quickly, operating primarily in the retail apparel market. He then turned this acumen towards buying and selling companies. His deals got larger and more profitable, his reputation for rapid deal-making grew, and so did his asset base.

In 2000, Green acquired BHS – the former British Home Stores – which he rapidly transferred to his wife, Tina Green. He followed this acquisition in 2002 with the purchase of the Arcadia Group of fashion retail companies, that included some of the big names on the UK high street: Topshop, Burton, Wallis, Evans, Miss Selfridge and Dorothy Perkins. This was also transferred to his wife’s name. As a Monaco resident, the tax implications of this ownership structure have attracted much criticism in the UK.

Already owning the second largest share of the UK clothes retail market, Green tried in 2004 to acquire Marks & Spencer – the largest clothes retailer. His bid failed with much vitriol between him and the then M&S boss Sir Stuart Rose. In 2006, Green was knighted for services to the retail industry. The 2010 general election saw him coming out strongly for the Conservative party – a move that was reciprocated by the new Conservative/Liberal coalition with his appointment to chair a review into Government procurement – of which he was highly critical.

Perhaps Green’s most notable failure was BHS, so his business story is not one of total success. By 2012, the company’s fortunes were waning, and in March 2015 Green sold the now loss-making business – debt free, but with substantial pension liabilities – for £1.

As a multi-billionaire (with his wife), Green’s spending and tax affairs attract as much media attention as his business activities. He is famed for lavish parties (spending several million pounds at a time) and equally known for his charitable and philanthropic spending. Forbes rate the couple’s joint wealth in 2015 at US $5 billion.

Business Lessons from Sir Philip Green

Whatever your view of him, Sir Philip has a talent for making decisions and turning a profit. Here are some lessons I draw from his experiences and choices.

Pace and Decisiveness

Green built his business on fast deals: rapidly doing the deal (often making a multi-million pound acquisition in days) and quickly turning that deal into a profit. Yes, Green is adept at risk-taking, but taking risks is not in itself the secret of success. Quickly assessing a risk and understanding your own capacity to handle it is what matters, and Green was a master, particularly during the 1980s and 90s.

The Rich get Richer

Money begets money, and Green used one conceptually simple ploy time and time again to grow his wealth. He would convince banks to lend him money to make his acquisitions – of stock in the early days and of businesses later – and then turn a profit and repay his debt quickly. On one occasion in 1985, he bought a bankrupt business with a large loan, traded for a short while and sold it six months later for nearly twice as much as he’d borrowed. He then went to his bank and asked ‘what do I owe you?’ They replied ‘3 million 430 thousand pounds’ and so Green wrote a cheque there and then, putting it on the counter and saying ‘Done.’

Discipline and Control

Green pays fiendish attention to every detail of his business, devoting much of his energy to driving efficiency into every last nook and cranny. Why, then, did BHS fail? I wish I could ferret that one out, because his regal progresses through his Oxford Street empire of London shops are well known within the business for rooting out even tiny discrepancies in the selling process.

Customers first: Owners second

Perhaps Green’s most closely held business belief is that shareholders drive the wrong decisions. Everything should be about giving your customers what they want, rather than pandering to shareholders. This is why he turned both BHS and Arcadia from publicly listed to privately owned companies. Maybe it is also why BHS failed for him: he could no longer figure out how to give customers what they want in a general-purpose, multi-product store. It will be interesting to see if and how its new owners can square the circle that Green could not.

And…

Of course there are other things too, but most of them are what any manager would tell you are obvious ‘no-brainer’ habits, like: know your business inside out, respect and trust your people, keep working hard, stay alert for opportunities, and protect your supply chain. The fact that Green does all of these does not make him different from many other successful business leaders. It’s the fact that he does them well and consistently, on top of the differentiators, that makes him exceptional.

Daniel Kahneman: Judgement and Bias

Daniel Kahneman has won many awards and honours, but none more surprising, perhaps, than a Nobel Prize. Why is this surprising? Kahneman is, after all, one of the most eminent and influential psychologists of his time. It is surprising because there is no Nobel Prize for psychology: Kahneman was co-recipient of the 2002 Nobel Prize in Economics ‘for having integrated insights from psychological research into economic science, especially concerning human judgement and decision making under uncertainty’.

In short, what Kahneman taught us was that, before he and his co-worker, Amos Tversky (who sadly died six years before the Nobel Committee considered this prize and so was not eligible), had started to study human decision making, all economic theories were based on the same, false assumption. Kahneman and Tversky taught us that human beings are not rational agents when we make economic decisions: we are instinctive, intuitive, biased decision makers.

And, if that sounds pretty obvious to us now, then we have Kahneman and Tversky, and their long walks together, to thank.

Daniel Kahneman

 

Short Biography

Daniel Kahneman was born in 1934 to Lithuanian émigré parents living in Paris (although he was born when they were visiting family members in Tel Aviv). When Nazi Germany occupied France, the family went on the run, ending up after the war in what was then (1948) Palestine under the British Mandate, shortly before the formation of the State of Israel.

In 1954 he gained his BSc from the Hebrew University, in Psychology and Maths, and joined the psychology department of the Israeli Defence Forces, helping with officer selection. Four years later, he went to the University of California, Berkeley, where he was awarded a PhD in 1961. He returned to the Hebrew University in 1961.

It was in 1968, while hosting a seminar, that he met Amos Tversky. They started collaborating shortly afterwards. Their fertile discussions often involved thought experiments about how we make decisions and judgements, uncovering in themselves a series of heuristics – or thinking shortcuts – which they went on to observe in controlled laboratory experiments. Their collaboration continued until Tversky’s death in 1996.

In that time, they collaborated with other researchers, most notably, Paul Slovic and economists Richard Thaler and Jack Knetsch. Their many insights into how we make judgements and the application to economic decision-making eventually led to the Nobel Committee recognising Kahneman with the 2002 Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel.

Kahneman’s 2011 book, Thinking, Fast and Slow is a summary of a remarkable life’s work. If the ideas are new to you, they may well rock your world. It is not an easy read, but it is remarkably well-written for an intelligent lay audience. Even if Kahneman’s work is familiar to you, this book will repay close reading.

Kahneman’s Ideas

There is far too much in Kahneman’s work to even begin to summarise it, so I want to focus on three biases that he discovered, which have a profound impact on the choices we make; often leading us far astray.

The Anchoring Bias

The first information we receive biases any subsequent choices we make. Your father was right (or your mother, or anyone else who told you at a young age that first impressions count). Systematically, the human brain takes the first information it receives and anchors its interpretation of everything else in the inferences it draws from that first impression. In management terms, this accounts for the horns-and-halo effect, which biases us to seek and spot evidence confirming our pre-existing assessment.

The Representativeness Bias

Who is a more likely person to find working in a car repair shop, changing your brakes? Is it A: a young woman with blond hair and pink eyeliner, or B: a young woman with blond hair and pink eyeliner, whose father owns the car repair shop?

If you think B, you have fallen for the representativeness bias. The story makes more sense in our experience, doesn’t it? A young woman with blond hair and pink eyeliner is not a person you’d expect to see in that environment. But a young woman with blond hair and pink eyeliner, whose father owns the car repair shop, may feel right at home. Statistically, though, this is rubbish: of all the young women with blond hair and pink eyeliner, only a small proportion will also have fathers who own a car repair shop.
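The underlying statistical point is the conjunction rule: a combination of two attributes can never be more probable than either attribute alone. With invented numbers:

```python
# Invented base rates, purely to illustrate the arithmetic.
p_look = 0.001            # P(young woman with blond hair and pink eyeliner)
p_dad_given_look = 0.02   # P(father owns a car repair shop | that look)

p_a = p_look                      # option A: the look alone
p_b = p_look * p_dad_given_look   # option B: the look AND the repair-shop father

# The conjunction rule: P(A and B) can never exceed P(A).
print(p_b <= p_a)  # True, whatever numbers you plug in
```

Option B adds detail, which makes the story feel more representative, while the extra condition can only shrink the probability.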

The Availability Bias

Recent events bias our perception of risk. They are more available to recall, and so have a stronger impact on our intuition than counter-examples do. The classic example is the perceived risk of train travel just after a train crash. Trains are safe: they rarely crash. Cars crash a lot: there are many accidents every day. But car crashes are rarely reported, so we have no immediate, intuitive sense of the statistics.

The Impact of Kahneman’s Work

Kahneman’s work has had a huge impact. Decision theory existed before he came along, but he and Tversky revolutionised it. It was Kahneman, along with Tversky, Knetsch and Thaler, who pretty much invented the discipline of behavioural economics – and perhaps the relationship that drove that development was the friendship between Thaler and Kahneman.

Now behavioural economics infuses much of public policy, and much of the social influence that corporations try to exert over us. Thaler’s book, Nudge (with Cass Sunstein), is a best seller, and Thaler and Sunstein both advise prime ministers and presidents. Next time you get a document from government, or go into a store, and find yourself complying with their wishes without thinking, there is a chance you have been ‘nudged’. And the roots of these ‘choice architectures’? They lie in understanding our heuristics and biases. That was Kahneman’s contribution.

Kahneman at TED

Here is Daniel Kahneman, talking about how we perceive happiness and discomfort.


Team Decision Making

The Management Pocketbooks Pocket Correspondence Course

This is part of an extended management course. You can dip into it, or follow the course from the start. If you do that, you may want a course notebook, for the exercises and any notes you want to make.


Managers often need to reach decisions as a part of a team; either as:

  • a member of a management team
  • a facilitator of their own team

In both cases, it will serve you well to understand some of the do’s and don’ts of team decision-making*.

Group Think

In the 1970s, the social psychologist Irving Janis examined how groups make decisions. He found that the group’s dynamic often inhibits exploration of alternatives. People find disagreement uncomfortable, so the group seeks consensus before it is properly ready. As the group approaches consensus, dissenting voices are rejected (and, indeed, often self-censored). Janis said:

‘Concurrence-seeking becomes so dominant in a cohesive group that it
tends to over-ride realistic appraisal of alternative courses of action.’

When we fall prey to Group Think, decisions tend to be based on ‘what we all know’ – members feel inhibited from challenging the consensus, and relevant information, ideas and challenges are never fully aired.

The group tends to have higher collective confidence in a decision than individuals would have in the same decision made alone. Groups also tend to endorse riskier decisions than their individual members would – perhaps that extra confidence leads group members to agree to decisions they would never make on their own. This is called the ‘risky shift’.

Other Features of Poor Group Decision-Making

People with more extreme positions are more likely than others to have clear arguments supporting their positions and are also most likely to voice them. This enhances risky shift.

The order in which people speak can also affect the course of a discussion. Earlier comments are more influential in framing the discussion and moulding opinions.

Once people have expressed an opinion in a group, it can be hard, psychologically, for them to change their mind.

Charismatic, authoritative and trusted individuals can also skew the debate around their perspectives – which will not always be objective or ‘right’.

Finally, it takes time for a group to discuss a topic and time is often at a premium. There will be pressure to curtail discussion and move to a decision.

Towards Better Group Decisions

  1. Start with a diverse team.
  2. Don’t let leaders, experts or charismatic individuals state their opinions or preferences up front.
  3. Start with a round robin of facts, data and evidence. Follow up with another round robin of comments, questions and interpretations of that evidence. This forms a solid base for discussions.
  4. If you must take a vote, put it off until after the discussion and then, ideally, hold a secret ballot to establish the balance of opinion.
  5. Appoint a devil’s advocate to find flaws in data and arguments.
  6. Before a decision is finalised, ask everyone to take the position of a critical evaluator and look for errors, flaws and risks.
  7. Divide the team into subgroups to discuss the issues, and have them debate the decision.
  8. Invite outsiders into the team to create greater diversity of thinking and overcome prejudices and confirmation bias.
  9. Give all team members equal access to raw data, so they can reanalyse it for themselves.
  10. Facilitate the discussion to ensure every voice is heard and respected – even the least senior and least forceful members of the group. If they deserve their place in the group, consider their perspectives to be of equal value.

Further Reading 

  1. The Decision-making Pocketbook
  2. The Wisdom of Crowds

* Grammatical Note

To apostrophise do’s or not?

  • In favour of not apostrophising is that it is neither a contraction nor a possessive term, suggesting that there is no good grammatical reason for introducing an apostrophe.
  • In favour of the apostrophe is the core function of punctuation to improve readability. The apostrophe stops it being dos and don’ts.

We sometimes forget that grammatical and punctuation ‘rules’ evolved to codify standard usages, but that language is fluid and grammar must serve the primary purpose of aiding communication.

By the way, you’ll see that I did not apostrophise 1970s.

If you think I should either have written dos, or found an alternative (thus subordinating words and meaning to style and correctness)… Sorry.