The Complexity Lens
The world is a complicated mess. It's unpredictable, hard to understand and, at times, dangerous.
And I mean everything: our lives, existence, human society, international trade, geopolitics, economic competition, physical infrastructure, etc. We experience time linearly, but our lives are part of, and depend on, a wild mesh of systems.
Modern life is a marvel. It's a marvel not only because we have made our existence safer and more comfortable than ever but also because our cozy lives require so many complex systems to interact and function smoothly. We take these for granted, and we rarely give them a moment's thought. And that's ok. That's the whole point: since we no longer have to worry about procuring or producing our food, energy and tools, we are free to focus our time and energy on other tasks.
But with this complexity and sophistication comes frailty.
Complex systems fail. And when they do, their failures can set off a chain reaction with enormous consequences. Most of the time, these consequences amount to a temporary inconvenience; other times, they entail a massive loss of resources or even human lives.
I'm a private pilot and aviation geek. I have always been fascinated by flight and aircraft, and I can hardly think of a more complex or higher-stakes system than flight, or commercial flight to be more precise.
Think about it. A modern turbofan engine is as complicated a machine as any. And the engine is just one of the many subsystems that make up a modern commercial aircraft. The wings don't get much attention, but they are a masterpiece of design and engineering. So is the fuselage. The landing gear, control surfaces, navigation and communications instruments, and cockpit are complex and sophisticated.
And an aircraft is itself but a single node in the complicated mess that is modern air travel. A plane cannot just take off and point its nose wherever it wants to go. There are entire networks, protocols and subsystems that govern the operation of aircraft. Training pilots and aircrew is itself a complex activity. Flight tracking systems and air traffic control are another. And the ongoing operations that must take place and be coordinated at every airport are yet another.
And the stakes couldn't be higher. Millions of people take to the sky every day, engaging in one of the most complex and high-stakes systems known to man. Most of us do it without hesitation. And modern flight is, by all accounts, a huge success.
In 2019, there were 20 fatal accidents out of an estimated 39 million commercial flights (including passenger and cargo traffic). That's equivalent to one fatal accident (an accident with at least one fatality) for every 1.95 million flights. And safety keeps improving, even as the volume of passengers and flights grows*.
If the accident rate for 2019 had been the same as in 2000, there would have been 35 fatal accidents. And 2019 was a notorious year because it included one of the two costly and high-profile 737 MAX accidents*.
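To make the arithmetic concrete, here is a minimal sketch in Python using the estimates cited above (20 fatal accidents, roughly 39 million flights, and the 35-accident counterfactual). The figures are this article's estimates, not official statistics.

```python
# Rough arithmetic behind the accident-rate comparison above.
# Figures are the estimates cited in the text, not official statistics.

flights_2019 = 39_000_000           # estimated commercial flights in 2019
fatal_accidents_2019 = 20           # fatal accidents in 2019
fatal_accidents_at_2000_rate = 35   # what 2019 would have seen at the 2000 rate

rate_2019 = fatal_accidents_2019 / flights_2019
rate_2000 = fatal_accidents_at_2000_rate / flights_2019

print(f"2019: one fatal accident per {1 / rate_2019 / 1e6:.2f} million flights")
print(f"At the 2000 rate: one per {1 / rate_2000 / 1e6:.2f} million flights")
print(f"Roughly {rate_2000 / rate_2019:.2f}x fewer fatal accidents per flight")
```

Running it reproduces the one-in-1.95-million figure and shows the improvement over the 2000 rate is roughly 1.75x.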
If aviation is so complex, how did it achieve this level of safety?
Like all high-stakes complex systems, aviation has redundancies and safety protocols built in. What may be unique to aviation is how the industry learns from every accident and how that new information is disseminated.
What does this have to do with business, investing and decision-making?
We deal and interact with complex systems all the time. The stock market is a complex system, if there ever was one. That goes for both the infrastructure that keeps it running smoothly and the dynamics at play in its behavior. The US's legal and tax systems are complex, and things get even crazier if you invest across international borders.
Companies themselves are complex systems, and the business activities they engage in also tend to be complicated. So it pays to understand complex systems and how they fail.
Lessons from Those Dealing with Complex Technology
Earl Wiener, a NASA scientist and a famous aviation expert, developed rules for dealing with complex flight systems. There are several versions of these:
Every device creates its own opportunity for human error.
Exotic devices create exotic problems.
Digital devices tune out small errors while creating opportunities for large errors.
Complacency? Don't worry about it.
There is no problem in aviation so great or so complex that it cannot be blamed on the pilot.
There is no simple solution out there waiting to be discovered, so don't waste your time searching for it.
Invention is the mother of necessity.
Some problems have no solution.
It takes an airplane to bring out the worst in a pilot.
Whenever you solve a problem, you usually create one. You can only hope that the one you created is less critical than the one you eliminated.
Automation is meant to reduce workloads during normal conditions (when workloads are already low) but increase workloads in extreme conditions.
When automation fails, there is a significant price to pay.
Although written explicitly with aviation in mind, these rules apply to all complex systems. They point out something obvious but often unacknowledged: there is frailty in complexity. Not only does the possibility of an error or accident increase when dealing with complexity, but the consequences of errors tend to be larger and mostly unknowable ahead of time.
If complex systems are so frail, why don't they fail more often?
Dr. Richard Cook, a research scientist and physician, wrote a short but powerful paper** on the subject while working at the Cognitive Technologies Laboratory at the University of Chicago. It is a dense, high-level view of a complex (ahem) subject. In eighteen short paragraphs, Dr. Cook distills the most critical lessons from the field of systems design. An ode to simplicity in the face of complexity.
In that short and fascinating paper, Dr. Cook explains that built-in redundancies, but above all, operators, are what keep things running smoothly. It's the people in the trenches, the ones interacting with systems, who are the first and last line of defense. In a speech Dr. Cook gave at a web operations conference, he shared further insights about planning for downtime and maintenance when designing a new system, because maintenance and repair are never really finished. These activities are continuous and closely integrated with the system itself, rather than discrete tasks with a set start and end time.
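As a toy illustration of those built-in redundancies, here is a minimal Python sketch, with hypothetical primary and backup functions, of a call that falls back to a redundant path and logs the failure so an operator can step in. It illustrates the general principle, not anything specific from Dr. Cook's paper.

```python
# Toy illustration of built-in redundancy: try the primary path, fall back to a
# backup, and surface the failure so operators (the real last line of defense)
# can investigate. The function names are hypothetical, purely for illustration.

import logging

logging.basicConfig(level=logging.WARNING)
log = logging.getLogger("complexity-lens")

def fetch_quote_primary(ticker: str) -> float:
    """Hypothetical primary data source; imagine it occasionally fails."""
    raise ConnectionError("primary quote service unreachable")

def fetch_quote_backup(ticker: str) -> float:
    """Hypothetical redundant data source."""
    return 101.25

def fetch_quote(ticker: str) -> float:
    try:
        return fetch_quote_primary(ticker)
    except Exception as exc:
        # Redundancy keeps the system running, but the failure is not silent:
        # it is logged so a human can act before the backup fails too.
        log.warning("primary source failed for %s (%s); using backup", ticker, exc)
        return fetch_quote_backup(ticker)

print(fetch_quote("ACME"))  # prints 101.25 after falling back to the backup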
Think about your own business's activities. A company, even the simplest and humblest of them, is a system of people, tools and activities geared to producing a product or providing a service. Next time you plan on implementing a flashy new technology, think of the project through the complexity lens. Is the project going to produce local optimizations but make the whole more fragile? What happens when it fails?
If we are trained to view business and investing through this lens, we are likely to find weaknesses and frailty, which is, of course, half the battle in a complex and ever-changing world.
** https://www.researchgate.net/publication/228797158_How_complex_systems_fail