We need to talk about the military reporting system. To grab your attention, I could suggest and then seek to prove that the military appraisal system is broken or fundamentally flawed. But I don’t believe it is. As a think piece, this article identifies three problems with the current system: limited accountability, grading boards, and readiness for the future. It then presents some practical recommendations for improvement. This article does not seek to completely overhaul the military reporting system; rather, it aims to offer constructive suggestions to make it more effective.
How does it Currently Work?
In the current appraisal system each service person has a 1st Reporting Officer (1RO). In most cases the 1RO is their immediate line manager. The 1RO writes a performance and potential narrative and makes recommendations on promotion and employability. The report is then read by the subject and sent to the 2nd Reporting Officer (2RO). The 2RO is more senior and gives a broader perspective on the subject’s potential employment. Critically, the 2RO insert carries more weight when read by promotion or appointment boards. A recognised challenge is ensuring that the subject is sufficiently well known to the 2RO, who may meet the subject only a handful of times over the reporting year.
Theme One: Limited Accountability
The first problem is the means by which the 2RO narrative is delivered. For example, the only errors in my own reports have been in the 2RO narrative. Common faults have included spelling mistakes (including my name), exercises I didn’t attend, recommendations for job roles I’d not expressed an interest in, and incoherence between the narrative and the recommendations. This is not a case of demanding to be written on more favourably, but more accurately.
The point is not that reporting officers don’t know their subjects well enough, or that mistakes are limited to 2ROs. The issue is that the subject does not see the 2RO narrative before a board is finalised and therefore cannot correct it until it is too late. The service personnel centres can return reports that fail policy (by including inappropriate comments), but the person best placed to identify errors is the subject.
The second point stemming from this is that the subject, and by extension the board, can read between the lines of a 2RO narrative. Yet the subject has no formal means of questioning the most important part of their report. Of course, a subject can appeal after the report has been finalised, but dragging a report back to your 2RO with red pen all over it is unlikely to encourage a positive response.
Defence policy enables challenge. Subjects are encouraged in Joint Service Publication (JSP) 757, the defence reporting manual, to address reports that are “inaccurate or open to misinterpretation” with their 1RO. And yet, on the part of the report that holds most sway, there is no accountability. Paragraph one of JSP 757 states: “The appraisal model used by Defence is designed to be open”, and it is open. But by the time a subject sees the 2RO paragraph it is too late. A similar lack of accountability was recently found in the Army’s punishment system as well. This does nothing to inspire confidence and often has a negative effect on the subject.
It should be noted here that in the Royal Air Force it is routine for 2ROs to deliver their reports to the subject. The idea that a subject’s report should be finalised in secret is not universal across defence.
Theme Two: Grading Boards
The second adjustment this article proposes is to the grading board and the grading process generally. The grading process and boards are not fair, their reasoning is concealed, and they are too subjective.
I’ve sat on many unit level grading boards and heard the arguments that have dictated the career progression, or regression, of both soldiers and officers. These have included phrases such as ‘they look the part’, ‘she is top because I like her’, and ‘he’s got loads of time left, drop him down’. I have also seen a Commanding Officer present his Sergeants’ grading list as a finished product without any input from subordinate commanders. Some officers are talented at representing their soldiers or officers. Others are awful. Subjects never see the full result of a grading board and are not able to scrutinise the arguments for or against their progression. Consequently, subjects have no means to argue that their grading misrepresents them. It makes them feel powerless and does not treat people as the organisation’s greatest asset.
Lack of Visibility
Sticking with the 2RO’s grading board for a moment. The 2RO decides where each subject is graded, their overall performance grade, and their appointment recommendations. The 1RO may fight to endorse the subject, but the 2RO makes the overall decision. And it drives how the 1RO narrative is written. 2ROs can demand that a report be re-written because a potentially accurate narrative didn’t reflect the arbitrary grading board from several months before. In the current system, if you are misunderstood by your 2RO, very little can help you. Others have already pointed out that reporting officers promote those in their own image, with little knowledge of the skills they may bring.
Fundamentally, there is a lack of visibility, and the process of subjective, unaccountable grading that drives both the writing of the report and the management of our people must be rethought. The Military Secretary’s Career Management Boarding Manual states: ‘At the heart of a fair, transparent, and auditable [career management] system is the Army Boarding process…it is a highly regarded system, maintaining TRUST in it is key to good Career Management.’ However, if the grading board that started the process was arbitrary, and the 2RO narrative was ambiguous, then the wrong people will be appointed or promoted. This is a weak link in what is otherwise a decent system.
Theme Three: Preparing for the Future
A good friend got a good report as a sub unit commander. He was looking after his people and meeting the Commanding Officer’s intent. However, he was graded fourth of five commanders in the unit and it broke his morale. That subjective assessment could easily have been omitted. A board would have been able to select him for promotion or appoint him to a chosen role without it. But the damage it did to his self-esteem was palpable. In the coming era of Functional Knowledge, Skills and Experience (FKSE), there is even less justification for arbitrarily grading individuals.
This subject lost out to a simplistic culture which could only define best/good/bad. If a career in the Armed Forces in the 21st century were simply a race to the top, then filtering the best from the rest would be applicable. However, as Programme Castle (an organisational attempt to redefine how talent is measured) is demonstrating, the future requires the organisation to value (and reward) multiple, different, non-traditional talents. Defence wastes people whose talent was ‘written off’ earlier in their careers through arbitrary gradings like this example. By grading that sub unit commander as fourth of five, his ability to compete for future jobs to which he may be very suited is unnecessarily diminished.
To paraphrase another Wavell Room article, without changing to a skills-based reporting system, defence will lack the skills needed for the future. Disenchanted personnel will leave when their skills are not recognised, creating a ‘death spiral’ in which more junior people are promoted in their place. 360 reporting may also provide a more structural solution to this problem. A bit far, perhaps. But can defence truly say that, without reform, the military appraisal system is fit for the future?
Chief Defence People, in the foreword to JSP 757, says “it is the duty of every commander, at every level, to make sure that they understand what appraisal means for their own career”. I submit that the grading process and the sequencing of the 2RO narrative fail that direction.
I propose three recommendations:
Firstly, amend the chronology of the appraisal so that the subject can see and comment on the 2RO narrative before the report is finalised.
Secondly, grading boards should produce accurate transcripts and be accessible to the subject. This would encourage more logical, justifiable reasoning and a “fair, transparent, and auditable CM system”.
Thirdly, let’s create more nuanced grading boards that are not bound by “Best/Good/Bad”, but by skills and experience. Military reporting needs a greater focus on talent management, not just looking after the best. This is critical for success in the 21st century.
Enacting these recommendations comes up against the age-old issue of strict hierarchical structures: those who can change the system are the very ones who have benefitted from it. We should tackle this issue head on, with an honest approach to feedback. We need to talk about military reporting.