One of the biggest challenges facing leaders today is that they operate in a business environment defined by VUCA: Volatility, Uncertainty, Complexity and Ambiguity. You’ve probably heard the acronym before, and (I hope!) you’ve given a lot of thought to how your leadership style can flex to meet the changing demands of a VUCA world.
In spite of this, Jennifer Garvey Berger and Keith Johnston explain in Simple Habits for Complex Times: Powerful Practices for Leaders that the “C” is often overlooked. It’s even misinterpreted by many people, who take it to mean “complicated” instead.
But what’s the difference, really? What does “complex” actually mean for businesses today, and in what ways does it affect how they respond to issues?
Defining systems with the Cynefin framework
When considering the difference between “complicated” and “complex”, Berger and Johnston use the Cynefin framework – a leader’s framework for decision making created by Dave Snowden.
Boiled down to its essence, the Cynefin framework is about predictability: how well does looking at past and present data help you figure out what might happen next?
If the cause-and-effect relationships in a given system are fairly straightforward, Snowden calls it a “simple” or “obvious” situation.
If causal links are less clear-cut, but still knowable with research and testing, then the situation is “complicated”.
If there are no discernible cause-and-effect relationships, and you really can’t predict what’s going to happen next, it’s “complex”.
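For readers who like to see an idea written as code, here’s a minimal sketch of that triage in Python. The response pattern attached to each domain comes from Snowden’s framework; the function and its string labels are invented purely for illustration:

```python
from enum import Enum

class Domain(Enum):
    OBVIOUS = "sense, categorise, respond"    # clear cause and effect
    COMPLICATED = "sense, analyse, respond"   # knowable with expertise
    COMPLEX = "probe, sense, respond"         # only makes sense in hindsight

def triage(causal_clarity: str) -> Domain:
    """Map a rough judgement of causal clarity to a Cynefin domain.
    The string labels are invented for this sketch."""
    if causal_clarity == "clear":
        return Domain.OBVIOUS        # e.g. fix the machine on the assembly line
    if causal_clarity == "knowable with analysis":
        return Domain.COMPLICATED    # investigate, then act on probable causes
    return Domain.COMPLEX            # experiment, observe, learn

print(triage("knowable with analysis").value)  # sense, analyse, respond
```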
When you’re facing an issue, start by asking whether it’s occurring in a predictable or an unpredictable situation. If it’s predictable (i.e. simple), you should be able to figure out the next steps fairly easily – say, if there’s a problem on an assembly line that can be fixed by repairing a machine. If it’s complicated, you’ll need to spend longer identifying the probable causes, but you should still be able to find a solution.
But if it’s an unpredictable (complex) situation, you’re out of the realm of the probable and into the purely possible. The issue could be arising because of many different things, and you have no way of knowing what will happen next.
To deal with issues in complex situations, you’ll need to let go of a tight focus on cause and effect. Easier said than done, of course – in fact, it goes against some of our most basic instincts as humans.
The instinct to find cause and effect
As Daniel Kahneman notes in Thinking, Fast and Slow – another essential read for anyone in a leadership role – human brains are so good at picking up cause and effect that they do it completely unconsciously. This certainly has its perks – as Berger and Johnston put it, that’s “one of the reasons we’ve been so successful as a species.”
But this pattern-finding instinct is so powerful that we often do it when we don’t have enough data. Unsurprisingly, this can lead you to make some bad calls. Perhaps you see B happening alongside C a lot, and think “well then, B causes C”, failing to realise that they actually have a common cause, A. Or maybe it crops up as bias – you (often unconsciously) view things through a faulty lens, which makes you see connections where there are none.
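A tiny simulation makes the common-cause trap vivid. Everything below is invented: A drives both B and C, and B and C never influence each other – yet they correlate so strongly that “B causes C” looks like a safe bet:

```python
import random

random.seed(42)

# A is the hidden common cause; B and C each depend only on A.
a_values = [random.gauss(0, 1) for _ in range(1000)]
b_values = [a + random.gauss(0, 0.3) for a in a_values]
c_values = [a + random.gauss(0, 0.3) for a in a_values]

def correlation(xs, ys):
    """Pearson correlation, written out to keep the sketch dependency-free."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = (sum((x - mx) ** 2 for x in xs) / n) ** 0.5
    sy = (sum((y - my) ** 2 for y in ys) / n) ** 0.5
    return cov / (sx * sy)

# Roughly 0.9: B and C move together, even though neither causes the other.
print(round(correlation(b_values, c_values), 2))
```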
Our inbuilt obsession with cause and effect also shows up in what’s called retrospective coherence, “a fancy term that means that when we look back at something, we can make sense of it all.” In other words, hindsight is 20/20. Small things that were overlooked often turn out to have had a big impact, but that doesn’t mean you would have spotted them if you’d only paid more attention. Countless things are happening all the time – trying to keep track of them all would simply be a waste of time.
What is the system inclined to?
Instead of tracking everything, take a step back and ask: what is and isn’t this system inclined to? Often issues are linked to fairly low-level or nebulous things: “systems organize themselves, and they organize around particular patterns of behavior and interaction. These become habits, and those habits shape the way the system works.”
Berger and Johnston give the example of procurement – how easy is it to get the equipment you need at work? If it’s a maze of red tape and unanswered emails every time you need something, you might be inclined to make a few guesses at the cause – and therefore at probable solutions. But in a complex situation, “probable” isn’t on the table:
“[In] the complex world, often the solutions are not in a straight line to the problem. Theorists talk about this as “oblique,” but we think of it as “neighborly,” meaning solutions that live in the next neighborhood over from the problem. You can’t find them if you hone in too tightly on exactly what you think the problem is or if you are looking for leverage points and racing ahead to an obvious (to you) solution. So you want to collect a whole lot of ideas about what’s going on in the system, ideas that open up new possibilities, ideas that are stories unattached to particular solutions.”
So in the procurement example, you might ask: is it only certain types of request that don’t go through? Is it a problem experienced at all levels? Do some departments have a harder time? This will lay out what the system is (and isn’t) inclined to, opening you up to a much wider range of possible interventions.
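As a rough sketch of what that questioning might look like against actual records – the procurement log and its fields below are entirely hypothetical:

```python
from collections import Counter

# Hypothetical procurement log: (department, request type, outcome).
requests = [
    ("engineering", "hardware", "stalled"),
    ("engineering", "software licence", "approved"),
    ("marketing", "software licence", "stalled"),
    ("marketing", "travel", "approved"),
    ("finance", "hardware", "approved"),
    ("engineering", "hardware", "stalled"),
]

# Count stalls along several dimensions rather than hunting for one cause,
# to surface what the system is (and isn't) inclined to.
stalled = [(dept, rtype) for dept, rtype, outcome in requests if outcome == "stalled"]
print(Counter(dept for dept, _ in stalled))    # which departments struggle?
print(Counter(rtype for _, rtype in stalled))  # which request types stall?
```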
Once you know a system’s current inclinations, you can consider what you actually want them to be. And to move the system towards them, it’s vital that you have a strong learning environment.
The take-home message is this: when faced with an issue in a complex situation, resist the impulse to narrow your focus. Instead, expand it.
By drawing on the broadest possible portfolio of potential solutions, you can come up with ways of nudging the system towards the patterns you actually want. These are the ‘simple responses’ – not systems overhauls, but discrete experiments.
Berger and Johnston summarise the different approaches to addressing “complicated” vs “complex” issues nicely:
“In the complicated, predictable world, it’s appropriate to research to find the very best solution and to defend your perspective heartily. In the complex, unpredictable world, the best approach is to leave aside the idea of the best answer and reach toward the idea of creating safe-to-learn conversations, strategies, and action plans.”
Experimentation is key here. You need to think of several interventions – usually small-scale or tangentially related to the main problem – then try them out. Vitally, failure is fine. Given that a complex system is inherently unpredictable, you will make some missteps. The key is to be open and observant, and make sure you learn from every experiment.
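To make that loop concrete, here’s one possible sketch – the interventions and the measurement stub are invented, and in reality each probe would run for weeks rather than milliseconds:

```python
import random

def measure_stalled_rate() -> float:
    """Stub: in practice this would be something observable, like the share
    of procurement requests stalled this month. Randomised so the sketch runs."""
    return random.uniform(0.2, 0.6)

# Several small, loosely related probes – not one big systems overhaul.
experiments = [
    "publish a one-page map of the approval chain",
    "let teams pre-approve purchases under a small threshold",
    "pair new requesters with someone who knows the process",
]

baseline = measure_stalled_rate()
for intervention in experiments:
    outcome = measure_stalled_rate()  # observe the system after each probe
    if outcome < baseline:
        print(f"amplify: {intervention}")
    else:
        # A 'failed' probe isn't wasted – note what it revealed about the system.
        print(f"dampen: {intervention}")
```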
“The key lever in a complex system,” say Berger and Johnston, “is learning; the key methods are conversation, discovery, and experimentation.”