UK MoD – Future Force Concept

This is our final look at the Joint Concept Notes published by the MoD. This one covers the UK's Future Force Concept and examines how the UK military will function in the cyber, air, land, and maritime domains, both independently and in conjunction with NATO and other partners.

Here’s what we thought:


Though I don’t believe this report tells us anything particularly new, it is an excellent reference work for anyone seeking to understand ‘where the MoD is at’, and where it sees itself in relation to the world going forward. In particular, the report is broken down into the main military domains (air, sea, land, space), with additional comment on the new cyber landscape. Of course, we should note that hanging over all of this report is the ever-diminishing budget for UK defence forces, which has in a sense forced the hand of the MoD in areas where we can no longer make the same level of investment that we once could.

However, in some ways, this could be to our benefit… As the report notes, maritime systems have traditionally been designed to last for anything up to 50 years. With the rapidly changing environment, such lifespans are no longer sustainable, as new technologies are fast rendering expensive investments obsolete. If limited budgets force us to look beyond the traditional ‘old’ way of doing things, then we may be able to extract additional value from our investments, or at least be more selective in them, focussing on our strengths such as innovation, skills and training, and forming closer working relationships with our allies.

Mike Ryder, Lancaster University 

 


This concept note seems to be a good analysis of how the UK military can do more with fewer people, less money, and less of an advantage over adversaries. The note is both pragmatic and optimistic. NATO seems to be the cornerstone of future British force deployments. As a Brit, it is a bit disappointing to think that the UK military is scaling down its ambitions as an independent force. But, as we know, an increasingly complex world requires a lot of money, materiel, and personnel, something the UK government has decided not to fund. Considering that most of the threats in this note don’t require nuclear weapons, it does raise the question of whether they should be a top priority in the future. But, of course, the risk of not having them in a renewed era of state aggression could be too much for any government to take.

I don’t know what any of the answers are. I’m not sure anyone really does. But this concept note is a great step towards considering some answers.

Joshua Hughes, Lancaster University


What do you think?

UK MoD – Future of Command and Control

This week we continue our look at the UK MoD’s Joint Concept Notes.

We’re looking at the Future of Command and Control this time. It’s a fascinating look into how leadership, battle management, and personnel management will function in the highly complex futures we imagine.

Here’s what we thought:


This fascinating paper gives an insight into the MoD’s position on modern-day Command and Control and the challenges faced in the new operating environment. As with the other papers we have looked at for TTAC, the emphasis here is really on flexibility and on a greater understanding of the challenges. In particular, the report emphasises the need to adjust Command and Control to a given mission and situation. While technology clearly provides many benefits to defence operations, the report also highlights the vulnerabilities that technologies bring about. Especially significant here, in my view, is the need to ‘maintain reversionary “off-line” modes and practices as a matter of course’ (46).

Mike Ryder, Lancaster University 

 


The first thing to strike me in this concept note is the clear belief that interstate relations are going to return to an age of persistent competition. From a hard-IR-Realist perspective we’ve never left this situation, of course. However, to think that states will move from an era of cooperation to an era of confrontation presents a significant change to the status quo. As anyone with a basic understanding of IR knows, this could be really dangerous. Although I suppose thinking about dangers and being a bit paranoid is one of the jobs of our defence industry: to think the worst and plan for it so we can all hope for the best.

What seemed to come through in this note is that the MoD know they need to make massive changes to how they command and control their forces, particularly in complex battlespaces and in confrontations below the level of armed conflict. However, it also felt as though the MoD don’t really want to do this. I also get the impression that, rather than trying to innovate and be ahead of conflict trends, the MoD is reacting to how conflicts have changed. Of course, one must take into account how conflicts progress, and the enemy gets a say in how conflicts unfold, but it does not seem as though the MoD are trying to dominate situations and set the tone for conflicts.

I also get the impression that command and control has drawn a great deal from management studies. If one swapped some words around, I think this document could be equally applicable to MBA students as to commanders. That is not necessarily a bad thing: leadership and management are key skills in both battle and business. But I can’t help thinking that the note didn’t seem to have victory in battle as its major focus.

Joshua Hughes, Lancaster University 


What do you think?

UK MoD – Human-Machine Teaming

This week we begin our triple-bill of Joint Concept Notes from the UK Ministry of Defence. You can see them all here. These are basically the documents which lay out how the UK military will develop in the future, and what its priorities and aims are over the next few years.

The first Note we are looking at is that focussed upon Human-Machine Teaming, available here. This considers how machines will work alongside people in order to get the best out of both to create the optimum operating environment for military success.

Here’s what we thought:


 

I found this to be a really insightful paper, outlining the MoD’s position on AI and robotics, and in particular the role of the human in relation to the machine. While there are too many topics covered to address in a short blog response, I found it interesting that the report highlights the potential for technology to shift the balance of power, and to allow minor actors to increasingly punch above their weight. This ties in with the report’s other comments about use, and the need to adapt quickly to changing demands. Using the example of a 2005 chess competition, the paper shows how a team of American amateurs with weak computers beat superior players using more powerful computers, demonstrating the importance of the interface between the human and the machine (39–40). While computer power is certainly important, such power used poorly or by unskilled operators does not guarantee success, and so we should not take success against ‘weaker’ powers for granted.

I was also particularly taken by a segment in Annex A at the end of the report in which the authors address the question of autonomy. Here, the report suggests that for the foreseeable future, no machine possesses ethical or legal autonomy (57), within the scope of the report’s own definition. The report then re-states the MoD’s position from September 2017 that ‘we do not operate, and do not plan to develop, any lethal autonomous weapons systems’ (58), which is an interesting remark, given the MoD’s own definition of autonomy as describing ‘elements with agency and independent decision-making power’ (57).  

 Mike Ryder, Lancaster University 

 


This concept note is a great overview of the major issues related to the employment of AI-based technologies alongside humans in conflict situations. Something the note mentions which I hadn’t given much thought to is the potential revaluation of state power not in terms of GDP, but in terms of human capital and expertise relating to robotics and AI. Whilst in my work I usually consider AI in weapon systems, that mostly relates to tactical rather than strategic advantage; the impact of AI in a strategic sense is something I haven’t really considered. As the note says (para. 1.9), Russia and Singapore are nations that, whilst they have a modest GDP in comparison to other states, have a high level of expertise in the underlying sciences fuelling AI and robotics. This has the potential to really change the way the world works, changing the landscape of power that has dominated the world since WWII.

Something else which caught my eye was the mention of how manufacturers can limit defence capabilities (para. 1.14). When manufacturers create systems using certain techniques and methods, the armed forces become locked into that system, and it might not be open to analysis or further exploitation by the military. In my research on AI in weapons, this can be problematic if the military, particularly when new systems are being tested, wants to know what the underlying code does and how it works. Not knowing this can have serious impacts on military effectiveness and legal compliance.

Whilst the note is focussed upon human-machine teams, something that stood out to me in paras 2.8–2.14 is the large number of tasks that the MoD intends to automate. To me, this seems to reduce the human role significantly. Perhaps, then, the ultimate goal of human-machine teaming is not to have humans and machines working in symbiotic teams, but to have humans managing large teams of machines instead.

What is quite striking about this report is its similarity in vision to papers produced by the US military in the 1990s about network-centric warfare and systems-of-systems approaches to fighting wars. On one level it does seem like the same vision of technological superiority in warfare is just being regurgitated. On another, however, perhaps the vision is in vogue again simply because we are close to having the technologies needed to make it a reality.

Joshua Hughes, Lancaster University 


What do you think?