Military professionals are decision makers, and to do that well they need to be able to employ a variety of decision-making techniques. Can they be scientists too?
Individuals can employ numerous mindsets in numerous roles, but the influence of training on mindsets should not be underestimated. Individuals will unconsciously drift back to their training and the mindset it engendered, particularly when faced with uncertainty.
The scientific method engenders a mindset that is contrary to the one trained into military personnel. This clash of mindsets leads to the incorrect use of scientific advice, which is detrimental to the planning and decision-making process. This is normally seen when staff and commanders cede planning or decision-making responsibilities to the scientific advisor, or ignore scientific advice during planning.
Like a 1970s stand-up comic playing to a Working Men’s Club, let me start by talking about my wife. She is a Consultant at the local hospital, having completed 15 or so years of clinical training. Aside from being able to go to the front of the queue when buying toilet paper, one of the results of this training is a mindset that judges the competence of a nurse by their ability to firmly place the required item in her hand, without being asked, at the exact moment of need. If she has to break her thought process or eyeline to get something, then the nurse needs further training. This is ideal for the clinical setting, where the problems faced are limited in scope and processes are well defined. It is not so good in other settings, such as at home with a well-meaning but non-psychic husband (boom, 1970s stand-up routine done).
So, in addition to the charming insight into my domestic life, what can we draw from this?
Well, not only does the mindset of the individual need to match the role and situation, but the influence of training on mindset is fundamental and should not be ignored. Over the past few years I have seen this mindset mismatch played out repeatedly in planning, decision making, and the scientific advice that supports it. At its worst, the planner and decision maker, like a subject in Milgram’s obedience-to-authority experiment, defer their roles to whoever is wearing the white coat at that moment. This results in cautious, inconsistent plans and ponderous decision making.
Why does this happen? Only a fool would try to address this complex and nuanced issue in a short article. Well, I am that fool and you probably wouldn’t read more than 2 pages anyway.
Who does what?
A brief (and idealised) description of a planner is someone who corrals the uncertainties or risks (both positive and negative) associated with a task into a format that allows timely and effective decisions to be made and subsequent actions to be taken. The eventual plan is the result of options for the distribution of risk being created and presented to the decision maker.
The decision maker, uncluttered by the mental fog and biases that are created by detailed planning, and with judgement informed by a ‘laser beam focus on why we are doing this’, judges if the distribution of risk is appropriate for the task and issues direction accordingly.
Assumption and complexity
In general, the military use tempo to get ahead of the situation and achieve the aim. Two of the most important ways of doing this are through the use of assumption and by accepting complexity; in other words, by exploiting risk. In addition, reserves and contingency plans help mitigate the consequences and hopefully maintain tempo when this exploitation of risk does not work out as wished. Planners and decision makers are also well versed in ‘sacrificing the sacred cow of perfection on the altar of the all-powerful Chronos’ (the god of time version, not the titan version) to achieve effect.
To enable all of this, the mindset of the planner and decision maker is optimised for exploiting risk to achieve application of effect, even if understanding is lacking.
Scientists (if such a broad term can be used to group such a diverse range of individuals, roles, and disciplines) plan, make assumptions, and make decisions every day. A wonder of the world and breathtaking example of this is the Large Hadron Collider at CERN. Scientists can obviously plan and make decisions. However, when ‘doing science’ (often involving bubbling test tubes and maths that uses letters instead of numbers), experiments are employed to test and prove a hypothesis.
In wildly, almost insultingly, basic terms, the experimental method tries to control assumptions and complexity, ideally eliminating undesirable uncertainties or risks, so that the relationships between isolated factors can be assessed and, hopefully, cause and effect proven. This elimination of risks takes time, uses resources, and makes any conclusions drawn less likely to work ‘outside of the lab’ when exposed to other influences. But the process does allow causal relationships to be identified and built upon.
To enable this, the experimental mindset is optimised for eliminating risk to achieve understanding, even if eventual application of effect is lacking.
So what?
Firstly, organisations must pay attention to ensure that the correct mindset is applied to the correct situation, and acknowledge that training will play a large part in creating that mindset. If an individual with decades of training and experience in eliminating risk is given planning and decision-making responsibility for a dynamic and time-constrained situation, you should not be surprised when decisions are inconsistent and slow and plans are over-cautious. And vice versa.
Secondly, it is the duty of the planner and decision maker to use the extremely valuable scientific knowledge that has been gained through experimentation that controls or minimises risks and to integrate it into the real world with the aim of exploiting the wider risks and generating tempo. Too often I have seen decisions inappropriately delegated to the scientific advisor in the belief that an experiment in a lab years ago is the same as life now. Experiments inform but do not replicate.
Thirdly, commanders should also be aware of authority bias and account for the fact that a scientific advisor may be trying to establish credibility in an intimidating organisation that they do not understand (possibly whilst re-living a childhood trauma perpetrated by the type of people they are now working alongside). The sooner trust is formed, the sooner the benefit.
Finally, I am starting nurse training in the forlorn hope that it improves my wife’s current assessment of my competence.