Musings on Strategic Investigation, Performance Improvement, and Rhetoric

Gracious Investigation


“Happy is your grace that can translate the stubbornness of fortune into so quiet and so sweet a style.”
Shakespeare, As You Like It
“Let us be just to him”
Dickens, Dombey & Son
A particular human quality enhances the otherwise emotionless rigour of analysis.   When this quality is present, we get to understand matters more completely, make better-informed decisions and increase our chances of getting agreement.  When it’s absent we get an incomplete perspective, weakly informed decisions, and maybe grudging agreement.  The best word I can muster for this quality is graciousness.
I promise I’m not going to preach, pontificate or pretend that I embody graciousness.  But I hope what I’m going to say is relevant to clear thinking and good decisions, and maybe even insightful.
First, here’s what I mean by graciousness: being generous to another person’s perspective when it sits uneasily with our current one.  I’m not talking about thoughtlessly accepting someone’s opinion, or pretending to agree in order to pacify.  I am talking about allowing someone else’s position to challenge ours.
I’m not advocating graciousness as self-sacrificing altruism.  I truly believe that graciousness benefits the thinking of the person being gracious.  By giving a fair hearing to arguments against our initial suppositions we can grow beyond the constraints of our conscious and unconscious beliefs.  When doctors believed “bad humours” caused disease, they didn’t wash their hands after handling cadavers.  Sometimes, their next task would be delivering a baby.  They changed this fatal habit only when they allowed germ theory to usurp their old mindset.  This ultimately gracious acceptance of a challenging view made them better doctors.
The self-serving benefit isn’t entirely inside our own minds.  Graciousness can help us have more constructive debate.  If we graciously welcome other views, conversations become more of an open dance than a defensive fist fight.  Counterparts might even want to dance with us again, rather than duck away as they would when we were stubborn, self-justifying, graceless smarty-pants.
I haven’t been able to find any scientific studies to back up my assertions about graciousness aiding understanding; but the process of science itself is a convincing example.  You know the approach: start with a thesis; challenge it with a different perspective, an antithesis; then look at the evidence and logic for each; and finally come up with a new thesis, a synthesis, that’s better than the one you both started with.  Continue repeating the process for the betterment of humankind until the end of your particular golden age.
Try taking this approach without being gracious about the antithesis.  Oversimplifying the extremes of history, we seem to have a choice: gracious consideration of challenging views (golden age, renaissance, freedom of expression, democracy), or defensive dismissal of those challenges (dark age, reaction, broadcast dogma, dictatorship).
Graciousness also helps us turn our analysis into action.  We become better company: more accepting and more acceptable.  We’re easier to bear when proven right, and happier when proven wrong.  And when we make decisions, others will be more likely to embrace them.
Anecdotal evidence is everywhere for how graciousness makes for good company and acceptable leaders.  Think of the people whose personalities you most admire, whose company you would seek, and whose advice you would follow.  I’ll bet they’re gracious.  It’s not entirely for his insights that Mandela is invited to so many dream dinner parties.
Everyday evidence of the consequences of a lack of grace is just as plain.  Look at the ranting, graceless, nitpicking comments below many online articles, and see how quickly the discussion deteriorates into defence, counter-attack and ad hominem abuse.  I’ll bet you don’t respect the ranters, and that you find it difficult to accept their good points.  I’d guess there’s about zero chance you’ll follow their advice.  Even the excellent Socrates, clinical but graceless, ended up with a choice of exile or hemlock.
That’s enough about not being defensive.  I also want to clarify that being gracious isn’t the same as defensiveness’s opposite: being a doormat.  Graciously accepting challenge gives us permission to graciously challenge others.  We can then occupy the firm ground of listening, considering and agreeing or disagreeing graciously, rather than the easy, low extremes of shutting up shop or murmuring martyrish acceptance.
Though I’m convinced it’s worthwhile, I find it a tough quality to adopt: emotionally harder than defensiveness, and mentally harder than pretending to agree.  I find it even harder the more heated the situation.  Maybe this required wherewithal is our biggest self-imposed barrier to enjoying graciousness’s merits.  I’ll let Hemingway make a final emotional appeal that works on me:
“By ‘guts’, I mean grace under pressure.”

There’s an 80% Chance That Your Analysis is Wrong, and You Know It

In an interview on the excellent EconTalk podcast, Nassim Taleb, the epistemologist and author of the best-selling books The Black Swan and Fooled by Randomness, gave a statistic that blew me away.

The results of 80% of epidemiological studies cannot be replicated.

In other words, when a research scientist studies the reasons for the spread or inhibition of a disease, using all the research tools at his disposal, and his results survive enough peer review to be published academically, there is still a four-out-of-five chance that predictions based on his theory will be wrong, or useless because of changed circumstances.

Taleb gave some innocent, and some less than innocent, reasons for this poor performance.

On the innocent side of things, he raised a couple of human thinking biases that I’ve talked about before: narrative fallacy and hindsight bias. In normal language this combination says that we’re suckers for stories: when we look at a set of facts in retrospect, we force-fit a story to them and assume that the story will hold in the future. Worryingly, as the amount of data and the available processing power increase, there is an increasing chance of finding accidental, random associations that we mistake for genuine explanations of what is going on. In a classic example of this, there’s a data-backed study showing that smoking lowers the risk of breast cancer.
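To make that concrete, here is a minimal simulation of data dredging (my sketch, not Taleb’s; all data is randomly generated): take a few dozen purely random “risk factors”, test every pair at the conventional 5% significance level, and spurious “findings” appear on cue.

```python
import numpy as np
from scipy import stats

# A sketch of data dredging: 40 purely random "risk factors" measured
# across 100 subjects. By construction, none is related to any other.
rng = np.random.default_rng(42)
n_subjects, n_factors = 100, 40
data = rng.normal(size=(n_subjects, n_factors))

# Test every pair of factors for correlation at the usual 5% threshold.
false_positives, n_tests = 0, 0
for i in range(n_factors):
    for j in range(i + 1, n_factors):
        r, p = stats.pearsonr(data[:, i], data[:, j])
        n_tests += 1
        if p < 0.05:
            false_positives += 1

# Across 780 tests, roughly 5% (~39) come up "significant" by chance
# alone -- each one a ready-made story waiting to be told.
print(f"{false_positives} 'significant' associations in {n_tests} tests")
```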

On the less-than-innocent side of things, we can of course use data to fool others and ourselves that our desired theory is true. Taleb is less kind, calling it the “deceptive use of data to give a theory an air of scientism that is not scientific”.

Even more worryingly, if peer-reviewed epidemiological studies are only 20% replicable, then I dread to think about the quality of the 99.99% of other, significantly inferior, analyses we use to make commercial, personal and other life decisions.

So what is Taleb’s solution, if we aren’t to be doomed to an 80% chance of being wrong about anything we choose to analyse? He advocates “skeptical empiricism”: not just accepting the story, which can give false confidence about conclusions and their predictability, but understanding how much uncertainty comes with each conclusion and how broad the range of possible outcomes really is.

At the risk of sounding pompous by disagreeing with and building on Taleb’s thoughts, I’d say there are three things we can do about this, if we stop kidding ourselves and admit the truth of our own biases and inadequacies. First, I think we know when we’re actively seeking a pattern in a set of facts to suit our desired conclusion, or when a pattern we spot seems too fragile, over-complicated or hard to test; we just need to be honest about how biased we are. Second, we need to be honest about how little we know, and how far wrong we can be, so that we are ready for outcomes well above or below our confidently predicted ranges. Third, we can design a test, pilot or experiment to find out how wrong or over-confident we were.
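The second point lends itself to a simple exercise. Here is a hedged sketch with invented numbers (mine, not Taleb’s): instead of quoting a single confident profit figure, put honest distributions on the drivers and look at the spread that comes out.

```python
import numpy as np

# Point forecast: 1,000 units at a £50 margin = £50,000 profit.
# Honest version: both volume and margin are uncertain (invented figures).
rng = np.random.default_rng(7)
n = 100_000

volume = rng.normal(1_000, 250, n).clip(min=0)  # demand could miss badly
margin = rng.normal(50, 15, n)                  # price pressure, cost drift
profit = volume * margin

print("point forecast: £50,000")
print(f"5th-95th percentile: £{np.percentile(profit, 5):,.0f} "
      f"to £{np.percentile(profit, 95):,.0f}")
print(f"chance of under half the forecast: {(profit < 25_000).mean():.0%}")
```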

Would you rather persuade yourself and other people that you’re right, or would you rather know the truth?

Some related links:
Background on Taleb:
http://en.wikipedia.org/wiki/Nassim_Nicholas_Taleb
Script and MP3 of Econtalk’s interview with Taleb:
http://www.econtalk.org/archives/_featuring/nassim_taleb/



Put Some Emotion into Your Decision-Making and Analysis


I’m a firm believer that emotion plays a central role in all decision-making. What’s more, I also believe that strong emotion should be used to stimulate much better analysis of how to improve performance or solve a problem.

While my wife picks herself up from the chair she’s fallen off, making unflattering comparisons between problem solvers, analysts, consultants, coaches, philosophers, scientists and Mr Spock, I’ll give you some context and take some time to explain what I mean by the heresy above.

I was listening last week to a podcast featuring a German philosopher called Sabine Döring. Her area of interest is the philosophy of emotion and its role in decision-making. In her interview, she provided three insights that got me thinking:

1. Emotions are by definition directed at something, which makes them different from moods. For example: feeling sad is a mood; feeling aggressive towards your cheating former lover is an emotion. So I can’t be an emotional or unemotional person, but I can be emotional or unemotional about a particular concept, person or decision.

2. It is ultimately your emotions that determine what matters to you when making a decision. In the most mechanical and number-driven decision-making, we still choose and weight the different factors based on such aspects as risk-aversion (worry), time-horizon (impatience) and reward (greed); there’s a small sketch of this after the list. And the vast bulk of decisions, being much less mechanical, require some major value judgments. In fact, if you don’t care about your decision-making criteria, then the whole thing is irrelevant and doesn’t require a decision.

3. Recent studies by her colleagues showed that people are generally more creative when happy (counter to the art-house dogma), and more rational and analytic when depressed.
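Here is the sketch promised in point 2, with entirely invented options and scores: even a “mechanical” weighted-scoring decision is steered by the emotions encoded in its weights.

```python
# Hypothetical options scored 1-10 on three attributes (all invented).
options = {
    "safe bond":      {"reward": 2, "speed": 3, "safety": 9},
    "startup equity": {"reward": 9, "speed": 2, "safety": 2},
    "property":       {"reward": 5, "speed": 4, "safety": 6},
}

def best_option(weights):
    """Rank options by a weighted sum of their attributes."""
    scores = {name: sum(weights[k] * v for k, v in attrs.items())
              for name, attrs in options.items()}
    return max(scores, key=scores.get)

# The weights ARE the emotions: greed, impatience and worry respectively.
greedy  = {"reward": 0.7, "speed": 0.2, "safety": 0.1}
anxious = {"reward": 0.1, "speed": 0.2, "safety": 0.7}

print(best_option(greedy))   # startup equity
print(best_option(anxious))  # safe bond
```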

So, contrary to the received wisdom that emotions cloud reason and need to be shoved to the backs of our minds when we’re trying to be rational, Ms Döring’s musings lead me to a list of insights that I hope can help us become better decision-makers:

1. The better we understand ourselves, the better decisions we can make. I’m not advocating self-indulgent soul-searching here, but I am proposing being alert to and honest about the emotion that motivates each decision (and, yes, greed counts if maximising reward is number one).

2. The more we care about something, the harder we will look for a solution or a way to make it work. There’s a downside to this, of course: we’re tempted to overlook things that run counter to our desired result. This is why one of my few personal rules is to be as emotional about finding the truth as I am about anything else.

3. Playing good cop/bad cop, or happy cop/depressed cop, about a decision will help you get first into the creative mindset, searching for ways of making something work, and then into the rational mindset, testing it. Some of the best management teams I know have permanent happy and depressed cops to create this productive balance.

So there you are: a rationale for more emotion. Hopefully, my photo above shows how emotion fits into my performance.


Don't Swallow Your Own Snake Oil


Business people need to be scientific in how they look at their companies and their markets, and in how they make decisions. If they don't, they may be lucky and thrive for a while, but they will ultimately and inevitably end up in ruin.

I want to be clear what I mean by the term “science” here. I don’t mean biology, chemistry, physics or any other examples from the school curriculum that restrict our thinking and, if I'm honest, put us off the subject. What I mean by science (in as unpretentious a way as possible) is a method and mindset of trying to find the truth of a situation or issue or problem; and caring first and foremost about finding the truth, irrespective of what that truth turns out to be.

It’s not about making an argument or proving a point. As soon as you start looking to defend a position or prove a point, then you're not a scientist, no matter what your qualifications or credentials. Mr Dawkins, looking to prove that God doesn't exist, isn’t a scientist. His antagonists, creationists trying to find evidence that He does exist, aren’t scientists either. Neither is anyone who selects information to justify themselves, rather than seeking information and testing the quality of their thinking to challenge themselves.

Therefore, you see true science exhibited more often in arenas where people need to get results, regardless of rationale or excuses, such as sport, gardening, medicine or the judge in the courtroom; and you see it less often where people need to be right, such as politics, interest groups, sales or the barrister in the courtroom.

Science, and the scientific method, is partly a thinking skill. It involves breaking down a problem into clear, discrete component parts with an analytical knife; using crystal-clear thinking to hang those parts together; making your assumptions and the gaps in your knowledge explicit; using facts to test those assumptions and your draft conclusion; changing the conclusion according to what the facts say; and then challenging that new conclusion in turn. You repeat this relentless process until you've got an answer with which you're satisfied. But I don’t want to go into the method any further here, because for most people the method isn’t the main issue.

The main issue is mindset. This mindset is about deliberately challenging your knowledge in the search for the truth of a situation and, crucially, being happy to change your view as the balance of facts dictates. It is not about collecting facts to make an argument or prove a point. This latter path is an aspect of rhetoric, which is a noble art, but it isn’t science. And unfortunately this point-proving seems to be a stronger instinct in the way our minds work than the discomfort of challenging our thinking and conclusions.

I'll give you an example of how easy it is to slip into the rhetor's mindset. I run training sessions for management consultants in the principles and practice of the consulting method, which is basically the scientific method. Everyone typically learns the scientific techniques to get to the heart of problems and crack difficult issues in a rigorous, objective and credible way. That is, until I split the learners into teams and ask them, as an exercise, to give me the case for retaining or abolishing the monarchy. I give the teams names: "Royalists" and "Republicans". As soon as they're given those positions, my students turn from objective scientists into aggressive rhetors, searching for evidence that backs up their position. One team searches for the massive cost to the taxpayer of the royal family, while the other searches equally hard for the vital tourism income it brings to the country. They can't help themselves in this one-sided, self-justifying behaviour. And I see the same behaviour in myself and others every day.

Now let me come around to what all this means for business. First of all, of course there's a time for rhetoric and making an argument: whenever you're persuading someone in a sale, raising finance, or recruiting a super-star graduate. But if you want to know the truth about an issue, and make the best decision for yourself and your own business, you need the scientist’s mindset. You need to be humble about your pre-conceived notions, be open to challenge, and be prepared for the discomfort of receiving and doing the challenging. As soon as you stop doing that, and start building a fortress of facts to support your rhetoric, you're on the road to ruin, with self-justification, post-rationalisation and excuses all the way down.

So I'll leave you with a couple of questions to ask yourself about whatever issue you're facing. Are you being a scientist, trying to find the truth about the issue that concerns you, or are you selecting facts to make yourself feel better and prove a point? Are you trying to make the patient healthier, or are you trying to sell yourself some snake oil?


How we regularly make poor strategic decisions without even realising, and what to do about it (4/4)

In the first three posts of this four-post series we described ten "cognitive traps": ways in which we think that can cause us to make damaging decisions without realising it.

The good news is that there are some decent tools to challenge these traps. We explain our two favourite approaches below, as lessons from science and sport.

1. A lesson from science – treat your beliefs as a hypothesis to be challenged

True science is not about test-tubes, double-blind tests and professors with moon-shaped glasses and speech impediments. It is about starting with a premise that you believe may be true – a hypothesis – and challenging it to see if you are right, or more likely, where you are wrong. Under this definition, you are more likely to see science from a good plumber trying to work out why your central heating makes a knocking noise than you are from a PhD nutritionist with research sponsored by High5, trying to persuade you that High5 is better than Powerade. The plumber is the scientist, challenging his hypothesis in search of the truth; the nutritionist is no more than a fundamentalist seeking and selecting evidence to support his initial position. Unfortunately, when we get attached to our ideas, the cognitive traps make us act more like the nutritionist than the plumber.

This is why the true scientist needs to adopt a mindset of challenging the hypothesis with data, and to hold no belief that the hypothesis is true until the challenges show it to be so. Some practitioners even go as far as setting up a formal challenge in the form of an antithesis: an alternative hypothesis that is posited as a more accurate or insightful version of reality. This approach isn’t confined to the material and commercial: the Catholic Church appoints a devil’s advocate to provide the rigour of challenge for its most important decisions, and the legal system applies the same rigour by having separate representatives for each side of a case.

So, how to apply this? Treat your belief as a hypothesis and challenge it, if necessary with your own devil’s advocate, to whom you give the seniority and power to challenge your decisions. And honestly expect your hypothesis to change as the evidence emerges.

Applying this lesson from science stops us being blind to the evidence at hand, but it doesn’t help us predict a future that is much more random than we think it is, or stop us being over-confident in our ability to predict it. To prepare for this, we take a lesson from sport.

2. A lesson from sport – prepare for a range of scenarios

A lesson learned by those of us who have been on the wrong end of a drubbing on the sports field or, more seriously, have experienced military action, is that no plan survives contact with the enemy. Whether you are a batsman facing a spin bowler about to treat you to one of his box of tricks, or a tennis player trying to decide if your opponent is stretched enough for you to approach the net without being passed or lobbed, you have to be able to cope with a range of scenarios. It doesn’t mean that your core game plan needs to be dictated by the opponent and environment, but it does mean that you need to be prepared for the range of scenarios that might play out. If you can’t deal with the high ball, you can guarantee that a good opponent will be sending up bombs for you to panic under all afternoon.

In business, the normal corporate downside scenario is maybe a 5% or 10% decline versus base case, which isn’t really a scenario at all, but a smaller version of the base case. A more useful exercise is to work out how we would still thrive if sales fell by 30% or 50%, or how we would grow if competitive substitute product X gained critical mass. How would we deal with costs? Where would we still invest, or even increase investment? Which divisions would we let go? What resources would we try to acquire? We are not trying to create a plan for every single situation that might come about. We are stretching our thinking, in order to understand the common things we need to do to thrive in whatever scenario does come about, and preparing ourselves to respond to the inevitable unpredictability.
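As a minimal illustration of the difference, with invented figures: a 10% downside barely dents a simple profit model, while a 30-50% fall forces exactly the structural questions listed above.

```python
# A toy P&L (all figures invented). Variable costs scale with sales;
# fixed costs only fall if management acts on them.
BASE_SALES, VARIABLE_RATIO, FIXED_COSTS = 100.0, 0.55, 30.0

def profit(sales_multiplier, fixed_cost_cut=0.0):
    sales = BASE_SALES * sales_multiplier
    variable = sales * VARIABLE_RATIO
    fixed = FIXED_COSTS * (1 - fixed_cost_cut)
    return sales - variable - fixed

scenarios = [
    ("base case",                 1.0, 0.0),  # profit 15.0
    ("'normal' downside, -10%",   0.9, 0.0),  # profit 10.5 -- survivable as-is
    ("sales -30%, no response",   0.7, 0.0),  # profit  1.5 -- near break-even
    ("sales -30%, cut fixed 30%", 0.7, 0.3),  # profit 10.5 -- thriving again
    ("sales -50%, cut fixed 50%", 0.5, 0.5),  # profit  7.5
]
for name, mult, cut in scenarios:
    print(f"{name}: profit {profit(mult, cut):+.1f}")
```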



For the full text of this series email steve@latitude.co.uk

How we regularly make poor strategic decisions without even realising, and what to do about it (3/4)

In our first two posts in this series, we discussed how "cognitive traps" help us fool ourselves into thinking we're making rational decisions, when the opposite is true. We covered six common traps and their sometimes disastrous consequences. In this post we cover the final four of our ten traps.

Trap 7: “Overconfidence in calibration”

In this trap, people underestimate the potential range of possible outcomes. In particular, they are often not sufficiently pessimistic in their downside scenarios, and/or attach too low a likelihood to major problems and pitfalls.

This issue is rife in business planning and financial projections. The more discrete and separate a business unit, the more visible its variability; groups of partially related businesses can appear easier to predict simply because under- and over-performing units average out. We often see business plans that overall come in at or slightly below target, but consist of component businesses showing enormous variations from the original projections. For some reason, we as managers believe in our ability to perform within a tight range of projected expectations, despite this consistent evidence to the contrary.
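A short simulation (a sketch with invented numbers) shows how the averaging works: ten units that each routinely miss plan by around 30% produce a group total that rarely strays more than about 10% from plan, flattering our sense of calibration.

```python
import numpy as np

# Ten business units, each planning 100, each missing plan with a 30%
# standard deviation, independently. Simulate many planning cycles.
rng = np.random.default_rng(1)
n_units, n_cycles, plan = 10, 100_000, 100.0

outcomes = rng.normal(plan, 30.0, size=(n_cycles, n_units))

unit_miss = outcomes.std() / plan                           # ~30%
group_miss = outcomes.sum(axis=1).std() / (plan * n_units)  # ~9%

print(f"typical unit miss vs plan:  {unit_miss:.0%}")
print(f"typical group miss vs plan: {group_miss:.0%}")
```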

Trap 8: “The fallacy of conjunction”

This is a trap in which people overestimate the likelihood that a series of highly likely events will all occur, and conversely underestimate the likelihood that at least one of a series of unlikely events will occur.

This leads management to believe that its mid-case scenario (which consists of all those highly likely events) is much more likely to happen than it actually is. The corollary is that it is reasonably likely that at least one of the many highly improbable, left-field events will occur, and management will correspondingly be less ready for it.
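The arithmetic of the trap is worth seeing with round (invented) numbers: a plan that needs ten 90%-likely things to go right holds up only about a third of the time, while ten independent 5% risks produce at least one surprise about 40% of the time.

```python
# Ten independent events each 90% likely: the "safe" mid-case plan.
p_plan_holds = 0.9 ** 10          # ~0.35

# Ten independent left-field events, each only 5% likely.
p_some_surprise = 1 - 0.95 ** 10  # ~0.40

print(f"chance the mid-case plan holds:  {p_plan_holds:.0%}")
print(f"chance of at least one surprise: {p_some_surprise:.0%}")
```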

We see this fallacy most often, again, in business planning, where a great deal of thought and preparation is given to the central scenario in the business plan, which from historic experience very rarely turns out to be true. It is why we at Latitude see business planning as a helpful process to prepare for possible futures, but see business plans as simply a means to this end.

Trap 9: “Failure of invariance”

This trap recognises that people are risk averse when prospects are positive but risk-seeking when they are negative. In a famous experiment, the vast majority of participants preferred a 100% chance of winning 500 pounds versus a 50% chance of winning 1,000 pounds. The same group preferred a 50% chance of losing 1,000 pounds versus a 100% chance of losing 500 pounds.
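It is worth noting that the two pairs of gambles have identical expected values, so a purely rational, risk-neutral decision-maker would be indifferent in both cases; only the gain/loss framing differs:

```python
# Expected values of the four gambles in the experiment above.
sure_gain  = 1.00 * 500     # certain £500 gain
risky_gain = 0.50 * 1_000   # 50% chance of £1,000

sure_loss  = 1.00 * -500    # certain £500 loss
risky_loss = 0.50 * -1_000  # 50% chance of losing £1,000

print(sure_gain, risky_gain)  # 500.0 500.0   -- yet most take the sure gain
print(sure_loss, risky_loss)  # -500.0 -500.0 -- yet most take the gamble
```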

Companies approaching distress, or those that have experienced the initial failings of an investment, seem to follow this risk-seeking tendency by trying ever more unlikely approaches to winning back their original money. It is almost always more rational, and loss-minimising, to write off the sunk cost or a percentage of equity, and to look at each decision on its own merits, without the need to regain lost ground. Using share options as a reward mechanism can exacerbate this problem by actually making the risk-seeking rational for the individual manager, incentivising him to act against the best interests of other stakeholders.

The other side of this coin, risk aversion when the company is ahead, is also very common and can lead to tremendous lost opportunity in new areas of business. This can be such a strong mindset in the team that created the company’s success that changing a winning team can sometimes be the only solution when seeking continued growth.

Trap 10: “Bystander apathy”

In this trap, people abdicate individual responsibility when they are in a crowd. Sometimes it is the apathy that stops anyone in a large crowd from intervening in a mugging; sometimes it is the abdication of individual judgement to the perceived wisdom of the crowd. It seems that the risk of taking the contrarian path and being wrong is worse than being the anonymous lemming going over the cliff with all the others.

We see this in the various fads and booms that we fail to understand but cannot afford to miss out on. The recent “arbitrage” profits experienced in the world of private equity from ever-growing P/E ratios are an example. Every Investment Director we spoke to when we surveyed them on this in 2006 knew that the P/E growth would have to stop at some stage, and admitted to stretching beyond managements’ business plans to make the investment case for a purchase. But no-one felt they could afford to stop investing. Everyone could see problems coming, and knew what would happen if they were left holding the baby when P/Es inevitably started shrinking, but to stop investing was to step out of the game.

There are numerous other cognitive traps, such as contamination effects from irrelevant data, and scope neglect, where our response fails to scale with the size of the problem; but you probably already get the drift: people aren’t as rational as they think they are, and they make irrational and potentially harmful decisions without realising it.

In our fourth and final post in this series, we suggest two ways of combating these cognitive traps, used for years in science and sport, in an attempt to throw some good sense and rationality back into the mix.



For the full text of this series email steve@latitude.co.uk

How we regularly make poor strategic decisions without even realising, and what to do about it (2/4)

In the first post of this series, we explained how "cognitive traps" trick us into making irrational, and potentially harmful, decisions without realising it. We explained traps one and two, and their damaging business and financial consequences. We will cover ten such traps in this series, and conclude with some suggested methods of staying rational and countering our seemingly inevitable tendency to fall into the traps.

In this post, we cover traps three to six, and their consequences.

Trap 3: “Availability bias”

This trap causes people to base decisions on the information that is to hand, usually in their memories, rather than the information they actually need: like the driver who loses his keys at night and only looks for them under lamp posts.

We see this at its most dangerous in Board or management workshops where the day is run on the basis that all of the important knowledge is in the room. We have even heard facilitators use this we-have-everything-in-our-heads-already assumption as a key premise for the entire strategy that emerges.

This cognitive trap also biases us towards recency and proximity – we don’t look back far enough for similar patterns or warning signs, and we don’t look far enough afield for analogous evidence of failure or success.

Trap 4: “Confirmation bias”

This trap causes people to look for evidence to prove what they believe to be true, rather than for evidence to challenge it: why Tories read the Telegraph and Socialists read the Guardian.

We see this bias at its most damaging in investment cases for acquisitions, and in business cases for investing money and time into new ventures or projects. Even when employing a third-party professional to assess the acquisition, venture or project, the investor or business manager will actually ask for affirmation or substantiation – “I’m just looking for confirmation of my hypothesis” – rather than “Challenge me and tell me where I’m wrong”.

Trap 5: “The affect heuristic”

This trap causes people to allow their beliefs and value judgements to interfere with a rational assessment of costs and benefits.

We find this most dangerous at either of two extremes: on the one hand, where the decision-maker is very passionate about a subject, and on the other, where he is once-bitten-twice-shy.

In the former case, whilst we find it critical that managers be passionate about their products or services, this passion can blind the person to reality, and can be impossible to address without introducing a very senior individual with authority to challenge assertions with information.

In the once-bitten-twice-shy case, we have seen private equity companies abandon entire sectors following one painful loss, and refuse to entertain the most solid business case if it shares even the remotest characteristics with historic loss-makers.

Trap 6: “The problem of induction”

Induction is the process of generating a general rule from a series of observations. In the absence of clear indisputable deductive relationships, induction can be all a person has to go on. The problem of induction is that the brain will look for neat patterns and will try to create a general rule even if it is based on insufficient information.

In acquisitions, people can over-estimate the performance and prospects of a company by seeing how satisfied its customers are. This creates an overly positive pattern from what is essentially a self-selecting group: non-customers and disgruntled ex-customers need including for the full picture. Another example of poor induction is where management projects forward on the basis of a new product’s first-year sales, forgetting that this first year was a golden year, in which everyone without the product bought one and would never need another.

A very common area where induction knows no bounds is the practice of regression: correlating one factor against another to create what superficially appears to be a causal relationship. For example, it is possible to infer high price sensitivity when analysing price-volume relationships, and miss the over-riding effect of heavily marketed promotions that commonly coincide with lower prices. I was humbled by the limitations of correlation when an analyst working for me at my former company determined an almost 100% correlation between pallet demand and GDP. We started to doubt the causality when the analyst realised that she had used the wrong source data and correlated UK pallet demand with Polish GDP numbers. Our confidence in the causality was damaged further when the correlation with the "correct" driver, UK GDP, was about 30% lower.
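The pallet anecdote is easy to reproduce. In this sketch (invented series standing in for the real data), two completely unrelated quantities that merely share an upward trend correlate almost perfectly in levels, and the “relationship” vanishes once you correlate year-on-year changes instead:

```python
import numpy as np

# Two unrelated, upward-trending series (invented stand-ins for the
# UK-pallet-demand vs Polish-GDP accident described above).
rng = np.random.default_rng(3)
years = np.arange(20)
pallet_demand = 100 + 5 * years + rng.normal(0, 4, 20)
foreign_gdp   = 200 + 9 * years + rng.normal(0, 7, 20)

# Levels correlate almost perfectly -- pure shared trend, no causality.
r_levels = np.corrcoef(pallet_demand, foreign_gdp)[0, 1]
print(f"correlation of levels:  {r_levels:.2f}")   # ~0.99

# Year-on-year changes don't correlate at all.
r_changes = np.corrcoef(np.diff(pallet_demand), np.diff(foreign_gdp))[0, 1]
print(f"correlation of changes: {r_changes:.2f}")  # near zero
```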

In our next post, four more cognitive traps, after which we will conclude the series with some suggested counter-measures.



For the full text of this series email steve@latitude.co.uk

How we regularly make poor strategic decisions without even realising, and what to do about it (1/4)

The shocking financial consequences of how we think

Niall Ferguson’s best-selling and televised “Ascent of Money” covers beautifully the evolution of the financial system from ancient Mesopotamia to today. It is a superb book that relates the crucial role of financial tools in the growth and decline of empires and dynasties. Ironically, in amongst this excellence, the chapter that resonates most strongly is the afterword. This is called, suitably and contemporaneously, The Descent of Money.

The theme of that chapter, and this post, is how our hard-wired thought processes cause us to make decisions with a mindset that was useful for our evolution, but that in business and finance is destructive and irrational. He lists a series of cognitive traps - ways in which we make poor decisions without realising it.

We at Latitude recognised every single one of these traps, both in ourselves and in the companies we support and review. We could also relate to the considerable damage that each one could cause if unchecked.

We illustrate some of these common traps in this post and the next two posts in this series. The terminology is complex, but the ideas are simple and you will recognise every one. Even becoming aware that they exist should help avoid their destructive consequences. In our fourth and final post of the series, we attempt to go one step further in helping to steer clear of trouble and distress by proposing two well-tested answers that have been used by science and sport from their outset.

Cognitive traps 1 & 2

Trap 1: “Extending the present”

In this trap, the individual assumes that the present is a good guide to the future: a much better guide than examination of past experience justifies.

We see this at its most common in business planning, where future revenues and costs are based on the present, plus or minus a small percentage. The reliable rule we apply to such plans is that they will usually be wrong, though the profit out-turn may end up more or less the same with good management and a little luck. The alternative to extending the present – acknowledging that we are much less knowledgeable about the future than we think we are – is a more uncomfortable state of affairs; but this more realistic mindset can lead us to adopt valuable approaches such as scenario planning, which make us much readier for when the unforeseen does happen.

Trap 2: “Hindsight bias”

Hindsight Bias is the trap that causes people to attach greater probabilities to events after they happened than they did before they happened. Whereas in Extending the Present, people assume the present is a better guide to the future than it actually is, with Hindsight Bias, people over-rate the past as a reliable guide to the future.

When we attempt to learn lessons from the past, we must therefore make sure we cover the failures as well as the successes. For example, we can blithely look at many successful companies and conclude that they were much more focused in the services they offered than their more mediocre counterparts; that service or product focus is a pre-requisite for success in all market-leading companies. We could use this to conclude that we should rid ourselves of all products, services or skills except those that form this single core. However, if we look at the failures, we see that many of those companies were also very focused; the successful ones were the minority that just happened to focus on the right thing. Looking at the full set of information, we would conclude that focus with no contingency plan is a high-risk strategy, one which more often than not will fail.
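A small simulation (invented parameters, purely illustrative) makes the survivorship point: suppose focus is simply a high-variance strategy, so focused firms both succeed and fail more often than diversified ones. Studying only the winners makes focus look like the formula for success; including the failures reverses the picture.

```python
import numpy as np

# 100,000 firms; half choose focus. Focused: 20% big successes, 60%
# failures. Diversified: 10% successes, 20% failures. (All invented.)
rng = np.random.default_rng(11)
n = 100_000

focused = rng.random(n) < 0.5
u = rng.random(n)
success = np.where(focused, u < 0.20, u < 0.10)
failure = np.where(focused, u > 0.40, u > 0.80)

# Hindsight studies the successes only...
print(f"focused share among successes: {focused[success].mean():.0%}")  # ~67%
# ...and never notices that focus dominates the failures even more.
print(f"focused share among failures:  {focused[failure].mean():.0%}")  # ~75%
```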

Hindsight Bias also leads us to project forward assuming that the models and mechanisms that worked in the past have a high probability of working in the future. We forget, or don’t realise, that what actually happened was one of a myriad of possibilities, the one that happened to be supported by the circumstances of the time. Stepping into the present, those myriad possibilities still exist, and the chances of the future turning out as we project are much smaller than we think.

In our next post: four more common cognitive traps that help us cause destruction without even knowing we've done so.



For the full text of this series email steve@latitude.co.uk