Last year I wrote about the currently fashionable term, 5th Generation warfare. I did so because more and more people were throwing the term around as if the nature of war and its objectives had changed, or were changing. My point was that if by 5th- or whatever-generation war we mean that the nature of war and its objectives have changed, then we are wrong. But if by the term we mean that the conduct of war is changing, that the zones of war and peace are now inseparable and that non-kinetic means can be employed as weapons, then the term is apt.
This is where the issue of emerging technologies comes in. At the heart of the operation of these technologies lies digital integration. They force us to think anew because they open up a wide range of problems. This is the paradox: just like the pre-digital era and its weapons, in this new age too, what we think makes us secure is also what makes us insecure.
But the digital is not just about virtual space. What happens in that space can have terrible physical consequences. A good example is cyber attacks: such attacks take place regularly even when there is no actual declaration of war in the sense of movement of troops, tanks, fighter jets etcetera.
The Prussian military officer and theorist Carl von Clausewitz famously talked about the fog of war. Plans can be made and hostilities begun. But how does one control events? How does one get into the mind of the adversary? How can one figure out all the actions and reactions? How does one ensure that everything planned to a tee works that way? There are too many imponderables. A sports reporter once told Mike Tyson that his adversary said he had a plan. To that Tyson replied, “Everyone has a plan until they get punched in the face.”
War, with its many battles, is far more complex than a boxing bout. The idea behind every military-technological development was and is (a) to gain an asymmetric advantage over the adversary and (b) to know in real time, or better still beforehand, what he plans to do. Both are crucial to victory: asymmetric advantage is important to worst the enemy; information about him is vital because information at all levels is power. Military and commercial R&D take care of the first; intelligence gathering and access to real-time information take care of the second. Neither can fully lift the fog, but both are important to conducting war in ways that give the better side a distinct advantage.
Take the case of the Pakistan Air Force’s Op Swift Retort on the morning of February 27: the Indian Air Force lost to the PAF strike package that morning because its pilots were made to lose situational awareness. In simple words, while the PAF knew the battle environment in an integrated way and dominated it, the IAF went blind. Wing Commander Abhinandan couldn’t even hear the warnings from his ground control when he was shot down. The rest, as they say, is a cup of tea and history.
But the lesson is clear: air forces as well as ground-based air defences now work in an environment that is digital and integrated. Pilots operate to suppress enemy air defences (SEAD) and sometimes seek to destroy enemy air defences (DEAD). Air combat is now mostly about beyond-visual-range capabilities combined with how well a strike package can operate in and dominate the digital environment in which contesting air forces function.
But this is not all.
Scholars are increasingly writing about a new wave of emerging technologies that are likely to disrupt strategic stability, in peacetime and crisis situations as much as during an armed conflict. “Emerging technologies such as cyber, lethal autonomous weapons [LAWs], artificial intelligence, additive manufacturing, stealth, synthetic biology, hypersonic vehicles, remote sensing, and distributed ledger technology are all poised to reshape the landscape of international relations.” It’s unknown territory in most cases. Many technologies may not mature or live up to their hype. But many could, and would. And they offer unique challenges. Consider.
Unlike pre-digital military systems, which could be kept from falling into unauthorised hands and whose proliferation could be largely, if not entirely, checked, most emerging technologies are contextually different. They are mass technologies: more and more states and companies are investing in and producing them; barring some, like LAWs, they may not have distinctly lethal characteristics; and they are as useful to ordinary people as they are important for military purposes. In other words, how can they be controlled without disrupting economic activity, indeed daily life itself?
Questions abound. But to the common question running through them all, whether these technologies will be developed and used for military purposes as much as for civilian ones, the answer is: yes, they will be. The reason is simple: given the technology trajectories and the available and developing knowledge and expertise, states will develop any technology that can give them an advantage (whether commercial or military) or help them blunt an adversary’s advantage.
Two other issues are very important. One, the legal and the normative follow the emerging technologies. You cannot have a legal framework for something that does not yet exist; it is only when a technology emerges and is employed that we can fully figure out how it is, or will be, impacting us. There could have been no nuclear weapons-related legal and treaty frameworks without nuclear weapons having come into play. Two, historically and empirically, technology begins to drive strategy, and strategy in turn pressures the scientific community to deliver more. Ralph Lapp, a nuclear physicist involved with the Manhattan Project, discussed this problem clearly in his book, Arms Beyond Doubt: The Tyranny of Weapons Technology.
Shortly after the use of nuclear weapons by the US against Japan, Bertrand Russell and Albert Einstein put together what came to be called the Russell-Einstein Manifesto, calling for the elimination of nuclear weapons; it led scientists and philosophers to gather at Pugwash, a village in Nova Scotia. Complete disarmament didn’t happen. What did happen was that the powerful states negotiated and put together a nuclear-weapons nonproliferation treaty. Some states could retain the weapons while others were barred from taking that route. That uneven treaty rests on three pillars: coercive, legal and normative. It has had its low moments, but it has mostly held.
Whether that would be possible with emerging technologies is a difficult question to answer because at the heart of most such technologies lie a computer, a fast internet connection and, in some cases, hacking skills. Until twenty years ago, the FBI was in the business of chasing young hackers; today, the NSA goes around recruiting them. On the non-military side, we have arrived at the point we call the Internet of Things (IoT). We have IoT in hi-tech homes. From there we have moved on to the Internet of Military Things (IoMT). That is the integrated battlefield. How does one control such an environment? States cannot declare that such digital integration is meant only for military purposes. That would spell the death of economic activity and upend life as we know it.
For now, the Wassenaar Arrangement, an elite club of 42 countries formed in 1996, seeks to “promote transparency and greater responsibility in transfers of conventional arms and dual-use goods and technologies.” The WA attempts to “ensure that transfers of these items do not contribute to the development or enhancement of military capabilities which undermine these goals.” It has a control list that details the items and technologies falling within the WA purview. However, as we move further down — or up — the digital track, such arrangements will come under new pressures.
Many of these technologies are also not very expensive to acquire, and costs will come down further as they proliferate. It is obvious that we will need a coercive, legal and normative framework for them, but at this point it is difficult to say what exactly such a framework would entail.
These technologies will also have an impact on state-society relations. China is already experimenting with some of these smart technologies to exercise greater control over its population; Snowden’s revelations tell us of the extensive NSA programmes to surveil the US population.
Integration and the digital environment have also brought their own risks. Just one example: the Royal Institute of International Affairs warned in a January 2018 report that US, British and other nuclear weapons systems are increasingly vulnerable to cyber attacks. In the report, Chatham House said:
“Nuclear weapons systems were developed before the advancement of computer technology and little consideration was given to potential cyber vulnerabilities. As a result, current nuclear strategy often overlooks the widespread use of digital technology in nuclear systems… The likelihood of attempted cyber-attacks on nuclear weapons systems is relatively high and increasing from advanced persistent threats from states and non-state groups.”
This is not exactly an abstract threat. Another report says that “In 2010, the US Air Force lost contact with a field of 50 Minuteman III ICBMs at Warren Air Force Base in Wyoming for an hour, raising the terrifying prospect that an enemy actor might have taken control of the missiles and was feeding incorrect information into the nuclear command-and-control networks.”
There is also the problem of artificial intelligence (AI). Research and development in the field, in both Artificial Narrow Intelligence (ANI) and Artificial General Intelligence (AGI), is proceeding apace, in the military as well as the civilian-commercial sphere. ANI applications already exist and are being used for a wide array of tasks. While purely military-purpose technologies (hypersonic missiles, combat drones, LAWs etc) can be categorised and kept away from civilian uses, technologies like additive manufacturing, ANI and distributed ledger technologies pose a complex challenge in terms of control, proliferation and employment.
Finally, though certainly not conclusively, as we enter this dark territory, to use Fred Kaplan’s term, it will do us good to remember the words of Salvador de Madariaga, once chairman of the League of Nations Disarmament Commission, about the direction of causality in disarmament efforts:
“The trouble with disarmament was (and still is) that the problem of war is tackled upside down and at the wrong end … Nations don’t distrust each other because they are armed; they are armed because they distrust each other. And therefore to want disarmament before a minimum of common agreement on fundamentals is as absurd as to want people to go undressed in winter. Let the weather be warm, and they will undress readily enough without committees to tell them so.”
The writer is former News Editor of The Friday Times. He tweets @ejazhaider reluctantly. This article is partially based on a presentation given at the annual conference of the Sustainable Development Policy Institute, an independent think tank based in Islamabad. It is by no means an exhaustive description of the challenges posed by emerging technologies.