Risk management for adults

Posted by Claudio Sabia

You can find the original slides for this presentation on SlideShare. Video is now available here.

After Earth, 2013

We humans don’t like being put in a position of uncertainty. It’s something you have surely observed in your everyday personal and professional life.

A well-established marketing rule says as much: “Thou shalt not give consumers too many options”. People freeze during the decision process and, if forced, either opt for the most familiar pick or give up entirely.

This dislike is so strong that we are ready (even willing!) to exaggerate the coherence and consistency of what we see and know in order to “decide” fast and get out of the deadlock: we would rather be wrong than in doubt! (cf. “Thinking, Fast and Slow” by Daniel Kahneman)

Why does this happen? It’s a behaviour we carry over from our origins. Neuropsychology tells us it’s due to many inclinations that influence us, the so-called cognitive biases. They are an evolutionary protection mechanism we built over time to let us make decisions faster, without freezing up. That’s right: what helped a naked man with a blunt stone in his hands flee from a hungry lion is still affecting us today and hindering our decision-making ability.

There are a lot of biases. Some of them are especially interesting to us.

The Matrix, 1999

The first one is the Choice Supportive bias, aka “Post-Purchase Rationalization Syndrome”: it’s what happens when we try to convince ourselves that we made a good choice in buying an ignominiously expensive gadget. We are associating positive attributes, ex post, with a previous choice.

The second one is the Hindsight bias. You have been there: a friend of yours (because we would never do this) claims to have foreseen the football results, well after the matches have actually been played.

Because of this inclination, we see past events as (easily) predictable. David G. Myers cites an enlightening study on this topic by researchers Martin Bolt and John Brink (1991):

[they] asked college students to predict how the U.S. Senate would vote on the confirmation of Supreme Court nominee Clarence Thomas. Prior to the Senate vote, 58 percent of the participants predicted that he would be confirmed. When students were polled again after Thomas was confirmed, 78 percent of the participants said that they thought Thomas would be approved. – Exploring Social Psychology, D. Myers, McGraw-Hill, 1994 (p. 18)

And without bothering famous researchers: Sherlock Holmes often said that everything is obvious once it has been explained…

Weekend at Bernie's, 1989

The third one is the Normalcy bias, by which we tend to underestimate both the probability and the effects of an unknown adverse situation: we let the lack of information block the formation of an action strategy. A resounding example from the depths of history is the Pompeii eruption in 79 A.D.: during the smoke-and-ash eruption of Mount Vesuvius, people kept going about business as usual, ignoring the possibility of a disaster. A disaster that promptly arrived, in the form of a deadly explosion, two days later.

The last one is the Ostrich effect, which causes us to ignore negative information by pretending it does not exist. The most powerful and common effect of this bias is procrastination over what we don’t know or don’t like.

All these biases clearly relate to people as individuals, but do they manifest themselves in more complex entities such as teams or companies? You bet: it’s a little bit like moving from psychology to sociology! We can map these biases to some business mechanics that most of us have encountered sooner or later in our careers.

The Hitchhiker's Guide to the Galaxy, 2005
  • As Billy Wilder once said: “Hindsight is always twenty-twenty”. Precise estimates, long-term detailed planning, assumptions about market trends and customer behaviour: these all rest on largely unpredictable processes, yet companies have forever sworn by them and built their roadmaps on them, against all evidence.

  • Companies willingly try to ignore risk in their everyday management. And when they do see it, more often than not they believe they can transfer it for free. As a well-known example, think about the false self-confidence given by contracts that allegedly allow them to blame someone else for the failure of a project, or that try to force ungodly restrictive clauses and unbalanced constraints on time and costs. It’s all driven by the Ostrich effect.

  • There is always the temptation to follow the known path to avoid responsibilities or the surprises that change may eventually bring. You can often see the exclusive use of big-name supplier firms, justified with generic efficiency reasons, enforced vendor consolidation and the perpetuation of smooth purchasing channels. That’s the Choice Supportive bias at work, and it’s so powerful that it’s actively used in marketing campaigns. Just remember Microsoft talking about the (lack of) ROI and reliability of open-source software, or the funny FUD (albeit not directly ascribable to IBM) “Nobody ever got fired for buying IBM equipment”.

Donnie Darko, 2001
  • Finally, the Normalcy bias induces a refusal to act upon issues and accidents, trusting fortune instead (!!). It sounds astonishing that fortune can be summoned in professional environments, but that’s the case: things will fix themselves sooner or later, against all evidence. And when things do go bad (because they will!), well, it’s surely bad luck or… someone else’s fault. Kin Hubbard nailed it long ago: “Lots of folks confuse bad management with destiny”.

To put all the above examples into context, let me tell you the tale of a project to create a B2B2C platform for a blue-chip international company. Everything starts with a very good idea from the marketing department: a new value-added service to be delivered to their customers through the existing infrastructure. Time-to-market is essential, since competitors are planning similar solutions, albeit not as innovative as this one. So a launch date is scheduled and set as an unmovable constraint. Things immediately start to get a bit hairy.

Eyes Wide Shut, 1999

Decisions on which supplier to contract, and even which infrastructural solution to employ, drag on for months, since some internal stakeholders prefer big, known suppliers regardless of their weak technical proposals [Choice Supportive bias]. This delay is followed by even more delay, since the Procurement Office is unwilling to accept the revenue-sharing deal we are proposing to keep initial costs low, as requested.

After winning the deal, we find ourselves the only stakeholder pushing for clear, shared milestones, needed to avoid driving up development costs or, at worst, failing to deliver by the launch date altogether. The other stakeholders, from marketing to the technical departments, unfortunately seem busier with the economic and technical details of the contract, completely ignoring the looming launch date and the amount of work required to get there. This is so underestimated that no penalty clause for late delivery is forced on us [Ostrich effect]!

At last, we are able to start working on the project, already saddled with an unrecoverable amount of delay. To mitigate this, we immediately propose an incremental, iterative approach to delivery, starting with the most prominent and innovative features to gain visibility at launch. Surprise! The marketing department is strongly against this approach, requiring that every feature be present at launch, even though delays have already eaten more than half the time needed to develop the solution. They repeat their mantra: “Because the project kicked off, it’s a supplier problem now” [Ostrich effect again].

We also find an unexpected reluctance to collaborate from the other stakeholders. We start delivering the platform components continuously, calling on all departments for feedback and a proactive attitude so we can shorten our development times. Despite the importance of the project (strong top-management commitment, public media exposure with tons of advertising, and budget availability), it takes a long time to get answers, if they come at all [Normalcy bias].

Star Trek: The Motion Picture, 1979

Fortunately, over the months this behaviour slowly changes, thanks to a lot of convincing on our part. The launch date is missed (as predicted) and we go live with a subset of the original features (since we cannot bend space-time, not yet!).

Still, the project is deemed a success. This is certainly true from a technical point of view, having deployed a robust solution that delivers customer value. And, apparently, from a business point of view too: our client is first on the market and customers love the service.

I say apparently because the competitors were beaten only because they suffered delays too, and while we were able to avoid a disaster, efficiency suffered greatly: the same results could have been obtained in half the time and with less budget.

The icing on the cake comes from a few people who now say they are pleased with their choice of the smaller supplier and the iterative approach [Hindsight bias]!

I think it’s pretty clear that these biases enjoy ample diffusion and that everyone suffers from them in a more or less pronounced manner. Fortunately, psychology again comes to our aid, providing a few antidotes that we can apply to overcome them: we can increase our evaluation accuracy by making the decision process non-automatic and non-mechanical (debiasing).

Socrates, 1971

We have unknowingly already applied one antidote, identifying and naming our biases: the principle of wisdom consists in giving a name to things (Socrates).

The second antidote is the valorization principle (from which various techniques like impact mapping and business analysis are derived). We want to treat the objective part of risk (the danger) rather than the subjective one (the fear): an (honest) risk analysis can help extract concrete elements to base our decisions on, avoiding paralysis. At that point we can decide to ignore information or consequences deliberately rather than blindly.
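
To make this concrete, here is a minimal sketch of such a risk analysis in Python: each risk gets an estimated probability and monetary impact, and we rank by exposure (probability × impact), the classic product used in Waltzing with Bears-style risk management. The risks and figures below are invented placeholders, not data from a real project.

```python
# A minimal sketch of a risk register: every fear becomes a number we
# can reason about. All risks and figures are invented placeholders.

risks = [
    # (description, probability, impact in EUR)
    ("Key supplier delivers late", 0.30, 80_000),
    ("Launch-day traffic exceeds capacity", 0.10, 150_000),
    ("Integration API changes mid-project", 0.20, 40_000),
]

# Rank by exposure (probability x impact), highest first, so the
# discussion starts from concrete, comparable elements.
for description, probability, impact in sorted(
    risks, key=lambda r: r[1] * r[2], reverse=True
):
    print(f"{description}: exposure = {probability * impact:,.0f} EUR")
```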

The third antidote is responsibilization: empowering our teams and requiring active self-accountability, in an environment where trust is fostered and widespread.

Indiana Jones and the Temple of Doom, 1984

Now that we have reworked the decision process and have information at hand, we can consciously move along the identified dimensions of risk, applying some known concepts in combination. This will appear obvious (your biases at work!), but it will help us enhance our risk management with pragmatic elements, like a focus on people.

Collaboration: as remarked everywhere, from the Agile Manifesto to other sources, the success of a project is determined by both customer and supplier, who share the same common good. I mean working together, in a partnership at least, and formulating contracts with a risk-sharing approach when possible.

Transparency: problems and unforeseen accidents need to be shared and managed as soon as possible, while we still have options for where to steer the project. Sharing must happen at the earliest occasion, involving all the (right) stakeholders to maximize the number and impact of pivoting options.

Wall Street, 1987

Real options: risk should be proactively managed. It’s not sufficient to cherry-pick the options naturally available; we must continuously invest in creating options. In my experience, three approaches are particularly apt at churning out options.

We want to develop multiple implementations in parallel, briefly investing in alternative technological and architectural solutions when we are not sure which one is best. We strive to optimize for change, favouring simpler solutions now that can be substituted later when (if) needed.
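
As an illustration, here is a minimal sketch of “optimize for change” in Python: callers depend on a small interface, so a deliberately simple implementation can ship today and a richer one can be swapped in later without touching the call sites. The Notifier names below are hypothetical, not taken from the project described above.

```python
# A minimal sketch of "optimize for change": callers depend on a small
# interface, so the simple implementation shipped today can be swapped
# for a richer one later. All names here are hypothetical.

from abc import ABC, abstractmethod


class Notifier(ABC):
    """The stable seam the rest of the code depends on."""

    @abstractmethod
    def send(self, user_id: str, message: str) -> None: ...


class LogNotifier(Notifier):
    """The simplest thing that works today: print the notification."""

    def send(self, user_id: str, message: str) -> None:
        print(f"[notify] {user_id}: {message}")


class QueueNotifier(Notifier):
    """A richer alternative we can substitute in later, if ever needed."""

    def __init__(self, queue) -> None:
        self.queue = queue

    def send(self, user_id: str, message: str) -> None:
        self.queue.put({"user_id": user_id, "message": message})


def notify_launch(notifier: Notifier) -> None:
    # Call sites never name a concrete class, so swapping the
    # implementation is a one-line change at wiring time.
    notifier.send("u-42", "The new service is live")


notify_launch(LogNotifier())  # today; QueueNotifier(...) tomorrow
```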

We practice deliberate discovery, the form of guided and conscious experimentation introduced by Dan North in his well-known article.

Resilience: as systems grow in complexity and size, we cannot entrust availability to robustness alone, especially considering the rapid pace of change that markets impose. Since we know that something will always fail and bugs will always exist, we need to transition from robustness (pretending to build components that never fail) to resilience (building solutions that tolerate the failure of some of their components). We want a fast and cheap recovery path to full functionality, always available. We want a system that does not need everything to go smoothly in order to succeed as a whole, at every moment. We want to create an environment where it’s always safe to fail.
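
Here is a minimal sketch of that shift from robustness to resilience in Python: instead of pretending a dependency cannot fail, we retry it a couple of times and then degrade gracefully to the last known-good value. fetch_rates is a hypothetical stand-in for any flaky remote component.

```python
# A minimal sketch of resilience over robustness: retry a flaky
# dependency, then degrade gracefully to the last known-good value
# instead of crashing. fetch_rates stands in for any remote component.

import random
import time

last_known_good = {"EUR/USD": 1.08}  # possibly stale, but safe


def fetch_rates() -> dict:
    # Simulates a remote call that fails about half the time.
    if random.random() < 0.5:
        raise ConnectionError("upstream unavailable")
    return {"EUR/USD": 1.09}


def get_rates(retries: int = 2, delay: float = 0.1) -> dict:
    for attempt in range(retries + 1):
        try:
            rates = fetch_rates()
            last_known_good.update(rates)  # refresh the fallback
            return rates
        except ConnectionError:
            if attempt < retries:
                time.sleep(delay)
    # Every attempt failed: tolerate the failure and keep serving.
    return dict(last_known_good)


print(get_rates())
```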

Bambi, 1942

I’d like to close this post with a short tale about a more recent project. I’m the father of a wonderful one-year-old girl, Eleonora. She’s still little, but she has just faced an important turning point: choosing whether to stand up, to explore a bigger part of the world at the risk of falling, or to remain safely on all fours, looking at the floor. Can you guess her choice?

In any given moment, we have two options: to step forward into growth or to step back into safety – Abraham Maslow

References

Barry Schwartz, The Paradox of Choice: Why More Is Less (video)

Daniel Kahneman, Thinking, Fast and Slow (video)

Tom DeMarco and Tim Lister, Waltzing with Bears

Tim Lister, Risk Management Is Project Management For Grown-ups

Olav Maassen, Commitment

Dave Snowden, Managing under conditions of uncertainty

Ryan Hanisco, Building Agile Accountability