
The Age of Disorder

Chaos and complexity are the future.  Change comes in shocks, not trends.  War doesn’t follow a predictable script.  To win in the Age of Disorder, we must embrace the chaos and adapt faster than our adversaries.

New World Disorder

“Whatever doctrine the Armed Forces are working on now, they have got it wrong. I am also tempted to declare that it does not matter that they have got it wrong. What does matter is their capacity to get it right quickly when the moment arrives”

Sir Michael Howard

Disorder is the defining feature of our era.  It shapes not just when we fight but, more importantly, how we fight.  The evolutionary pressure of conflict requires armed forces to adapt.  That is easier said than done.  Military adaptation has historically been erratic and, in peacetime, is shaped by variables outside a force’s control.  The problem is not new, but technological acceleration is making it starker.

This article proposes putting the primacy of adaptation at the centre of British military thinking.  Our conceptual thinking about war and conflict must move away from the institutional arrogance of insisting we can predict and bring order to something that is inherently chaotic and complex.  We can’t, and we won’t.  We risk optimising for the war we want instead of the war we will have to fight.  Arguing over expeditionary versus warfighting forces, or drones instead of armour, is a distraction.  The real issue is resisting the siren call of certainty.

Adaptation, not optimisation

Change is the nature of war.  From Correlli Barnett’s The Audit of War to Andrew Krepinevich’s The Army and Vietnam, military history has repeatedly identified the failure to adapt as a decisive failing.  Yet, despite its importance, it is not prioritised.  Adaptation is a principle of war, but we neither select for it in recruiting nor train specifically to develop it.  With our expensive platforms, specialised systems and ‘just in time’ supply chains, Western nations have become good at designing forces around things going right.  We should think harder about how those forces perform when things go wrong.  You may believe you’re adaptable, but you’re part of a system that prioritises conformity and certainty over adaptation.

Embrace the Chaos

“You get pseudo-order when you seek order; you only get a measure of order and control when you embrace randomness”

Nassim Taleb

Why are we so terrible at prediction?  The answer lies in our understanding of complex systems.  In Overcomplicated, Samuel Arbesman explores the differences between physical and biological systems.  Physical systems tend to be uniform and deterministic; they have patterns which are easy to interpret and predict.  They are complicated.  In biological systems, however, components are far harder to disentangle from the whole and carry many more dependencies.  They evolve and adapt to evolutionary pressures to optimise the system’s survival.  They are complex.

Complicated vs Complex 

As technology and our interactions with it increase in complexity and connection, the result increasingly resembles a biological system.  War is the same.1  Complex adaptive systems are ecosystems that are both complex (numerous, diverse and interrelated components) and adaptive (possessing feedback mechanisms that allow evolution).

Why does any of this matter to the military?  Operating in complexity means that causality is difficult to ascertain, and relationships are non-linear (consequence is not proportional to the action).  You don’t know what will happen when you act, and even if you do, you have no idea by how much.  

Adaptation requires more than just recognising uncertainty; we must actively embrace it.  The most effective forecasters, known as ‘Superforecasters’,2 don’t merely predict.  They use Bayesian reasoning: starting from the probability of an event given previous evidence (the prior, or base rate) and updating that belief (the posterior) as new evidence is observed.  They adapt through interdisciplinary thinking and experiential learning.  To thrive in warfare, we must learn the language of probability.
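
To make the updating step concrete, here is a minimal sketch of a single Bayesian update in Python.  The prior, likelihoods and scenario are illustrative numbers only, not doctrine or data from any real assessment.

```python
def bayesian_update(prior, p_evidence_if_event, p_evidence_if_no_event):
    """Return the posterior probability of an event after seeing new evidence."""
    # Total probability of observing the evidence at all.
    p_evidence = (p_evidence_if_event * prior
                  + p_evidence_if_no_event * (1 - prior))
    # Bayes' theorem: P(event | evidence) = P(evidence | event) * P(event) / P(evidence)
    return p_evidence_if_event * prior / p_evidence

# Illustrative base rate: we judge an enemy offensive 20% likely this month.
prior = 0.20

# New evidence (say, an observed logistics build-up) is more likely if an
# offensive is coming (80%) than if it is not (30%).
posterior = bayesian_update(prior, 0.80, 0.30)
print(f"Belief after evidence: {posterior:.0%}")   # roughly 40%
```

Each new observation simply becomes the prior for the next update, which is exactly the habit of continuous revision the Superforecasters practise.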

None of this should be novel to you.  Complex warfare, uncertainty, and adaptation are well-trodden areas of military doctrine and thought.  So why is there such a large gap between what we write and what we do?

Newtonian Armies in a Quantum Age

Our brains are not hardwired for probability and uncertainty.  Humans don’t like it.  It is why society’s visionaries are ostracised as loons, until they’re suddenly tech billionaires.  The public sector tolerates uncertainty even less: every pound spent must be justified.  We default to loss aversion, procedure, and certainty.  We like to remember, not imagine.  We see the irregular and unpredictable and design a process to fix it.

Our military conceptual models reflect this.  They are mechanistic and logical.  The ‘Newtonian approach’ to warfare sees causality as a logic chain: if we do A, then B will happen.3  Decade-long capability plans and reviews against predicted geopolitical and technological trends are examples of this.  Our operational doctrine and training follow rehearsed scripts, relying on technological overmatch to generate dilemmas for our adversaries.  We are trying to reorder the world in the image of how we think.  We treat war as a procedural problem, setting ourselves up for defeat at the Edge of Chaos.

Organisations that suffer failures of imagination are organisations that die.  Don’t take my word for it.  Numerous Western military thought leaders have called for a change in our thinking, with varying degrees of traction.  General Mick Ryan asserts that “a key virtue for military organisations in war must be adaptability to unexpected events“, pointing to Ukraine’s battle for adaptation and highlighting the investment required in human capital in War Transformed.4  General Stanley McChrystal made it a central tenet of his leadership, stating “Adaptability, not efficiency, must become our central competency“, aiming to develop teams that are not optimised for particular skills but that adapt to uncertainty.5  General Graeme Lamb highlighted the need for “an operating system that can deal with these unintended and intended consequences, a system that is able to deliver the intended effects while synchronising a response for the next game play“.6  At an organisational level, the Australian Army has adopted elements of adaptive campaigning, treating warfare as a complex adaptive system and designing operational art around the adaptive cycle.  In the United States, adaptive planning is being explored.7

Swans, shocks, and systems

“It is far easier to figure out if something is fragile than to predict the occurrence of an event that may harm it.  Fragility can be measured; risk is not measurable. This provides a solution to what I’ve called the Black Swan problem — the impossibility of calculating the risks of consequential rare events and predicting their occurrence”

Nassim Taleb

No one has done more to popularise the practical use of probability and uncertainty amongst laymen than Nassim Nicholas Taleb, author of Incerto.  Taleb’s thinking goes something like this: Life is significantly more complex and random than is generally believed.  You cannot optimise against the future because you cannot predict it.  This is evidenced by our inability to cope with high-impact, unpredictable shocks known as Black Swans.  The systems we develop as a society tend to overspecialise against specific outcomes, which makes them fragile.  They then shatter when exposed to shocks. By giving our systems small, unpredictable, and frequent shocks, they adapt or fail, better equipping the survivors to deal with and exploit opportunities from Black Swans.8

Conceptualising this, Taleb coined ‘Antifragility‘ to describe things that grow from disorder.  More than merely robust, antifragile systems actively adapt to stimuli and become stronger over time.  They thrive on chaos.9  Take the human body.  To prepare soldiers for the rigours of combat, we do not coddle them; that would leave them fragile and unfit.  We subject their physical systems to frequent, small stressors so that they learn, adapt, and grow.  On match day, conditioned soldiers don’t break under pressure.

If you’ve made it this far, you’re undoubtedly asking, “Got it, but what are you proposing we do about it?”.

Good question.

We must create a force that not only has the resilience to survive the crucible of modern war but adapts, strengthens, and becomes more lethal in the face of its adversaries.

We must build an antifragile military.

A way forward

In Antifragile, Nassim Taleb describes numerous principles he believes can build antifragility into daily life, from removing unnecessary complexity to the Lindy Effect.  Adherents have applied this analysis to organisations and leadership.  For military purposes, there will be no call to revolution here.  To embed adaptive capacity into our people, processes and equipment, I offer three areas in which we already have strong foundations: optionality, simplicity and learning.

These have all evolved as successful institutional practices.  To make this work, we do not need to search for clever new solutions; we need to exploit the simple foundations we already have.

Optionality

“Float like a butterfly, sting like a bee….”

Muhammad Ali

Optionality is ensuring you have options in an uncertain and complex situation.  To maximise outcomes, you must be able to hedge your bets.

The first condition of optionality is ensuring you’re in the right place at the right time to maximise your desired outcome.  History gives examples of highly successful, organically evolved systems (genes, capitalism, viruses) bound to a single outcome (survival, profit, proliferation).  Despite having no higher designer or central control mechanism, the aggregation of free-agent behaviour allows the system to succeed in a complex environment.10  Failure is limited to one agent, but success cascades through the system.

In dynamic situations of high uncertainty, complex problems are not solved by complex solutions.  Simple principles will do.  The agent closest to the event is almost always best placed to decide.  Hierarchies often fool themselves into thinking they make better decisions because they have more data and a broader understanding of the strategic picture.  This is a fallacy.  Even AI, with its incredible ability to collate, sift and present data, does not handle uncertainty well.11  More data doesn’t mean better decisions; it means slower ones.  The quality of decision-making rests mainly on experience, clarity of purpose and proximity to the action.12  To be effective, decentralised decision-making must be underpinned by speed of action, executive authority and the sharing of information.

The fundamental principle, however, is clarity of purpose, which McChrystal calls ‘Shared Consciousness’.13  Ensure your intent, purpose, and boundaries are clear, then empower subordinates.  We call this Mission Command.  It is our strategic edge, and it must be nurtured.

Obvious?  Of course, but we rarely practise it.  Why?  Because the incentive to do so does not exist in barracks, a ‘benign’ and certain environment with defined rules.  You wouldn’t run a car wash or a production line using Mission Command.  Delegating authority is not efficient: it reduces control, is expensive and carries more risk of failure in output, time, or reputation.  Getting it wrong is easily measurable; getting it right rarely is.14  Nevertheless, if we do not prepare ourselves for the uncertainty of war during peace, when is the right time?

Information

The second condition is the right information.  Enter interdisciplinary thinking.  All specialists have mental models that allow them to understand complexity.  Economists understand behavioural incentives.  Data scientists understand probability.  Engineers understand first principles and systems thinking.  You get the picture.  In complex systems, none of them are wrong, but none are entirely correct either.  Interdisciplinary thinking is what Charlie Munger calls ‘the latticework on which to hang your mental models‘: a conceptual toolbox that allows you to maximise success in novel situations.

In Range, David Epstein distinguishes ‘kind’ from ‘wicked’ environments.  Specialist thinking – the holy grail of Western technical and academic proficiency – allows you to interrogate data, understand rules, and repeat patterns for success in a ‘kind’, rules-based environment.  It rewards conformance within your domain.  ‘Wicked’ environments, on the other hand, are uncertain and complex.  Relying on experience from a single domain is not only limiting but has repeatedly proven disastrous.  Specialists operating in novel or uncertain environments double down on the mental models they are familiar with, not the ones that work.  Due to their internal biases, highly educated specialists perform worse on out-of-domain knowledge tests than the general public.15  For complex problems, analogical thinking produces better results than specialist education,16 whilst interdisciplinary thinkers have proven better at adapting to volatility.17

Service personnel of the 21st century need to have a good grounding in military science and process and be able to synthesise technical knowledge across multiple domains whilst operating in ambiguity.  We need data scientists, physics graduates and systems engineers as much as tacticians, logisticians, and intelligence analysts. 

To paraphrase Epstein, when the going gets uncertain, it’s breadth that makes the difference.

Don’t get taken out of the game – think like a system.

“It ain’t how hard you can hit.  It’s how hard you can get hit and keep moving forward”

Rocky Balboa

Simplicity, redundancy, and modularity are our friends.  Be the ant colony, not the elephant.  The elephant is robust but fragile: remove a leg and you generate system failure.  Each ant is individually vulnerable, but no single ant is a point of system failure.  If one dies, the colony lives on.  System failure requires the destruction of almost every ant.  Even removing the queen’s command function doesn’t stop the colony from conducting its business, at least in the short term.

Many of our capabilities are elephants.  Some of this will be unavoidable in newer technology, but we must constantly strive to reduce the complexity in our systems. Over time, regardless of how complex or novel the original technology is, mature industries tend to adopt a modular system architecture, where components are replaceable without breaking the overall capability.18
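
As a sketch of what ‘replaceable without breaking the capability’ means in practice, the example below defines a stable interface that modules plug into.  The interface, module names and platform are hypothetical illustrations, not drawn from any real vehicle architecture or programme.

```python
from typing import Protocol

class Sensor(Protocol):
    """Stable interface: any module honouring it can be fitted to the platform."""
    def detect(self) -> list[str]: ...

class OpticalSensor:
    def detect(self) -> list[str]:
        return ["visual contact, bearing 045"]

class ThermalSensor:
    def detect(self) -> list[str]:
        return ["thermal contact, bearing 045", "thermal contact, bearing 170"]

class Platform:
    """Depends only on the Sensor interface, never on a specific module."""
    def __init__(self, sensor: Sensor) -> None:
        self.sensor = sensor

    def report(self) -> list[str]:
        return self.sensor.detect()

# Swap the sensor module; nothing else in the capability changes.
vehicle = Platform(OpticalSensor())
vehicle.sensor = ThermalSensor()
print(vehicle.report())
```

The point is the boundary, not the code: as long as new components honour the agreed interface, a damaged or obsolete module can be repaired or replaced without redesigning the platform around it.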

In military terms, modular, simple systems allow for depth and scale.  The modern battlefield is more transparent than ever, and it is impossible to hide from the proliferation of highly capable, inexpensive sensors.  Regardless of how lethal we are, we are unlikely to avoid attrition.  With depth, we can take more hits; with scale, when we take hits – and we will – we can repair or replace cheaply and quickly.  

The architecture is the easy part.  The hard part is challenging the way we think.  The ant colony is built around system resilience, not platform resilience.  Western militaries recoil in horror at the thought of losing platforms.  This is fragile thinking.  The ant colony doesn’t care if a single platform is destroyed.  The system lives to fight on.

Antifragile technology is already here.  Not every piece of hardware requires a decade-long CADMID cycle with onerous requirements in infrastructure, maintenance and policy.  Kit such as drones, power generation and processors can instead be consumed and replaced in weeks.  Advances in Land Open Systems Architecture and Generic Vehicle Architecture allow modularity within newer armoured fighting vehicles.  See Think Defence’s Anglo Engineering Concept for a well-thought-out example of an open systems vehicle fleet.

Unleash your Chaos Monkey

“Everybody has a plan until they get punched in the mouth”

Mike Tyson

Systems without stressors are systems that calcify.  They become complacent, slow and fragile.  Netflix solves this through Chaos Engineering, a discipline that improves systems by testing how they behave when subjected to failure and turbulence.  The Chaos Monkey resilience tool enforces failure by shutting down critical services and infrastructure in a pseudo-random manner.  The benefit to military systems is obvious.  The phrase ‘Failure isn’t an option’ should be banned, replaced with ‘Failure is our only option’.
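
A toy sketch of the idea follows.  It is not Netflix’s actual tool; the service names, failure probability and ‘exercise serial’ framing are invented purely to illustrate pseudo-random failure injection.

```python
import random

# Hypothetical services an exercise headquarters might depend on.
services = {"satcom": True, "logistics_feed": True, "fires_net": True, "gps": True}

def chaos_monkey(services: dict, probability: float = 0.25) -> list:
    """Pseudo-randomly knock out running services so the wider system must cope."""
    killed = []
    for name, running in services.items():
        if running and random.random() < probability:
            services[name] = False
            killed.append(name)
    return killed

# One injection per exercise serial: fight on degraded, then restore and repeat.
for serial in range(1, 6):
    downed = chaos_monkey(services)
    if downed:
        print(f"Serial {serial}: lost {', '.join(downed)} -- fight on degraded")
        services.update({name: True for name in downed})  # restore before the next serial
    else:
        print(f"Serial {serial}: all services up -- no stressor this time")
```

The training audience never knows which serial will lose which service, which is exactly the point: the stressor is cheap, frequent and unpredictable.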

The training environment would be a good place to start.  War doesn’t follow a script.  If we train with a script, we will fight with one, and it is a script our adversaries will not follow.  A good start would be a better OPFOR along the lines of the Experimental Battlegroup.  It shouldn’t need saying that failure is an opportunity to learn, yet we cannot get past our reluctance to train for failure.

Personally, I have attended almost two dozen major exercises but have never been part of a defeated force.  That is not down to luck or talent; we simply don’t train for it.  Yes, our exercises are arduous, but there is no point being cold and under time pressure while making the same comfortable decisions.  By not allowing ourselves the opportunity to be defeated, we deprive ourselves of system stressors and valuable experiential learning.  If we only learn how to deal with comfortable, successful outcomes, we will lack the institutional confidence to deal with the uncomfortable and unsuccessful ones when they arrive.

Next, incentivise failure.  Organisations that say one thing but behave, reinforce and incentivise another are incoherent.  We talk a lot about failure, but nowhere does an annual appraisal report judge how well you have failed.  Our appraisal system incentivises caution and conformity.  Failed projects aren’t treated as opportunities for learning but as orphans no one wants.  And when no one wants to admit failure, we end up with bad projects that we remain emotionally invested in and committed to long after we’ve learned the valuable lessons.  A better world is one where we admit failure early, recognise the usefulness of honesty, and accept that we’ve learned how not to do it.

Lastly, adaptation must be informed by a robust codification of the lessons learned from shock and failure, cascaded throughout the system.  Distinct from just more ‘data’, which overloads decision-makers, lessons act as institutional wisdom.  The Army already has tools for this, from the Army Knowledge Exchange to the Lessons Exploitation Cell, where valuable information is collected.  What does broader Defence have?  This successful process should be refined further by looking at how frontline units interact with these lessons: first, making them usable in contact, allowing local agents to absorb lessons rapidly and use them to inform decision-making; second, enabling swift and effective lateral communication, setting the conditions for ideas to merge, new ones to develop and existing ones to be refined.  Maybe lessons professionals are the chaos monkeys of the future?

Conclusion

Predicting future war is impossible.  Our current thinking is fragile, optimised for a type of warfare we have little control over.

But we can and should position ourselves for victory.  We can hedge our bets by embracing and building adaptive capacity.  In our people, through mission command and interdisciplinary skills.  In our hardware, through an emphasis on system resilience and the use of modular, simple, inexpensive platforms.  And in our processes by subjecting them to extreme pressure and codifying lessons.  We have the means to do so; all we need is the will. We must be ruthlessly and unapologetically obsessed with winning.

By making adaptation our core strategy, we generate the conditions to outthink and outpace our adversaries in battle.  We become the threat they lose sleep over.  They begin to fear us.  And then we win.

There are too many possible futures for us to pick one. To win the high-stakes wars of the 21st century, we must choose them all.

 

Ben Johnson

Ben Johnson is a Major in the British Army, currently serving as 2IC of an Attack Aviation Regiment.  He has operational experience in Iraq and Afghanistan and has served within a US Divisional staff and at DSTL.  He has specialised in Strategy with Harvard Business School, gained an MSc in Battlespace Technology and is pursuing another in Digital & Cyber Systems.  He is a fellow of the Institute of Innovation and Knowledge Exchange.

Footnotes

  1. Schaaff, K. P. & Bossio, F. T., Warfare as a Complex Adaptive System, 1996
  2. Tetlock, Philip & Gardner, Dan, Superforecasting: The Art and Science of Prediction, 2016
  3. Schmitt, John, Command and (Out of) Control: The Military Implications of Complexity Theory, 1996
  4. Ryan, Mick, War Transformed, 2022
  5. McChrystal, Stanley, Team of Teams, 2015
  6. ‘In Command and Out of Control’, The Strategy Bridge, 2015, https://thestrategybridge.org/the-bridge/2015/12/30/in-command-and-out-of-control-sir-graeme-lamb
  7. Chairman of the Joint Chiefs of Staff Guide 3130, Adaptive Planning and Execution Overview and Policy Framework, 2019
  8. Taleb, Nassim, The Black Swan, 2007
  9. Taleb, Nassim, Antifragile: Things That Gain from Disorder, 2012
  10. Johnson, Steven, Emergence: The connected lives of ants, brains, cities and software, 2002
  11. Epstein, David, Range: How generalists triumph in a specialised world, 2020
  12. Storr, Jim, Something Rotten: Land Command in the 21st Century, 2022
  13. McChrystal, Stanley, Team of Teams, 2015
  14. Townsend, Stephen, Reinvigorating Mission Command: It’s Okay to Run with Scissors, 2019
  15. Rosling, Hans, Factfulness, 2018
  16. Epstein, David, Range: How generalists triumph in a specialised world, 2020
  17. Tetlock, Philip & Gardner, Dan, Superforecasting: The Art and Science of Prediction, 2016
  18. Clayton Christensen’s Theory of Interdependence and Modularity.
