Present and Future: Unmanned Aerial Systems
How AI Can Change the Paradigm
By definition, a hunter-killer mission is a task that combines two functions, "hunting" and "killing", executed in concert. Such tasks are normally performed by two different components of the same team. For instance, hunter-killer armoured vehicle teams consist of scout vehicles that hunt the targets and tanks that execute the kill. In a two-person sniper team, one member hunts the target with specialised optics while the other executes the precision kill with a sniper rifle. Scout and attack helicopter teams pair up in the same way.
However, the same hunter-killer tasks, when applied to unmanned aerial systems (UAS), have come to assume a slightly different connotation. A single UAS can carry out intelligence, surveillance, target acquisition and reconnaissance (ISTAR) for extended periods (owing to its long endurance) using state-of-the-art sensors, while also delivering precision kills with a variety of lethal munitions carried on board. Hunter-killer missions are therefore perceived to be executable by a single UAS platform as 'search and destroy' (or find-and-fix) operations.
Drone strikes as part of targeted-killing campaigns against jihadist militants are increasingly prevalent in various parts of the world, including Pakistan, Afghanistan, Yemen, Syria, Iraq and Somalia.
Drone Strikes - Lethality and Inaccuracy
A host of factors add to the inaccuracy of drone strikes: limitations in operator training, delays and errors in the communications/satellite link between the ground control station (GCS) and the UAS, limits on the accuracy with which the on-board sensors can take a target fix, and the technical limitations of the weapons being fired, to name a few.
As a reference, let us consider the Predator/Reaper class of UAS, which is at the forefront of drone strikes.
The data links and the entire connectivity network in this class of UAS are normally configured on High/Very High/Ultra High Frequency (HF/VHF/UHF) bands over voice/data/cellular/landline telephony, with hardware connectivity to satellite communication (SATCOM) terminals.
In this configuration, the first probable source of divergence between "what is commanded" and "what gets executed" enters the loop through the inherent delay between the GCS operator moving the joystick and the UAS receiving the command.
In the Reaper UAS, this delay is reportedly about 1.2 seconds. Given the speed of the UAS (220 knots, roughly 113 m/s), this lag is enough to carry the aircraft well over a hundred metres from the point where the command was issued. Since the target is a single human being, even this much delay causes inaccuracy.
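As a rough check on the figures above, the drift caused by the control-link delay can be computed directly. This is a minimal sketch: the 1.2-second delay and 220-knot speed are the figures quoted in the text, and the function name is illustrative.

```python
# How far does the UAS travel between the operator's joystick input
# and the aircraft receiving the command?

KNOTS_TO_MS = 0.514444  # 1 knot = 0.514444 m/s

def lag_drift_m(speed_knots: float, link_delay_s: float) -> float:
    """Distance in metres flown during the control-link delay."""
    return speed_knots * KNOTS_TO_MS * link_delay_s

# Figures quoted in the text: ~1.2 s link delay at 220 knots.
drift = lag_drift_m(220.0, 1.2)
print(f"{drift:.0f} m")  # roughly 136 m of drift before the command arrives
```

Against a stationary human target, an uncorrected offset of this order dwarfs the target itself, which is why the lag matters at all.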
The next issue that directly affects accuracy is the capability of the on-board sensors to acquire a target. A typical sensor suite on a frontline UAS may include visual targeting sensors, short/mid-wave infrared (IR) sensors, a colour/monochrome daylight TV camera, an image-intensified TV camera, a laser rangefinder/designator and a laser illuminator. Cumulatively, this suite is referred to as the Multi-Spectral Targeting System (MTS).
The heart of the MTS in a modern UAS is the Synthetic Aperture Radar (SAR). Much of the accuracy of fire depends on how finely the SAR can distinguish one human target from another. Experience shows that even with a resolution of 0.1 minute, the radar footprint cannot be narrowed to a single individual; the minimal swath within which it will allow a guided weapon to hit covers more than one person. This lies at the heart of "unintended kills". It must be understood that the sensors on board the UAS are basically designed for surveillance, not for guiding weapons to pin-point kills of individual humans.
As for the weapons fired for the kill, the Predator/Reaper arsenal consists of the AGM-114 Hellfire missile, AIM-92 Stinger missile, AGM-176 Griffin missile, Brimstone missile (UK), GBU-12 Paveway II laser-guided bomb (LGB), GBU-38 Joint Direct Attack Munition (JDAM) and others. Even where these weapons are guided by accurate millimetric-wave radars, they basically carry high-explosive (HE) blast/fragmentation warheads or HE anti-tank (HEAT) anti-armour charges, with the terminal explosion triggered by impact/proximity fuses. Notwithstanding the fact that most of these are precision-guided munitions (PGMs), their kill zones are far too large for them to be the weapons of choice against a single individual.
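A toy single-shot model illustrates why such warheads are ill-suited to single-person targets. All numbers below are assumed for illustration and are not published weapon data; the circular-normal relation P = 1 - 2^(-(R/CEP)^2) is the standard textbook link between a weapon's circular error probable (CEP) and the chance of impacting within a given radius of the aim point.

```python
import math

def p_within(radius_m: float, cep_m: float) -> float:
    """Probability the impact point lies within radius_m of the aim point,
    under the standard circular-normal miss model."""
    return 1.0 - 2.0 ** (-((radius_m / cep_m) ** 2))

lethal_radius_m = 15.0   # assumed blast/fragmentation lethal radius
cep_m = 5.0              # assumed guidance CEP

p_target_killed = p_within(lethal_radius_m, cep_m)
lethal_area_m2 = math.pi * lethal_radius_m ** 2

print(f"P(target inside lethal zone) ~ {p_target_killed:.3f}")
print(f"Lethal area ~ {lethal_area_m2:.0f} m^2")
```

With these assumed figures, the intended target is almost certainly inside the lethal zone, but so is everyone else within roughly 700 square metres; the precision of the guidance does nothing to shrink the warhead's footprint.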
In a power-play of technology, there are ongoing efforts by the front-ranking nations of the world to integrate drone reconnaissance and kill capabilities with advanced artificial intelligence (AI) and machine-learning tools.
Put in its simplest terms, the drones of tomorrow will pick and choose their targets using the tools of face recognition. Given that the digital signature of each face on this planet is different, and that fairly accurate high-resolution images of the targets on the hit list are available in national threat libraries, it will be possible to pre-feed the digital signatures of intended targets into the target acquisition suite of the drone's electronics.
With this done, it would be possible to ensure that the strike is authorised only when the intended target has been correctly fixed through face recognition. This will ensure that collateral casualties due to wrong target selection are reduced or eliminated.
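The "pre-fed digital signature" idea can be sketched in a few lines. In practice a face-recognition model reduces a face image to a numeric embedding vector; the sketch below compares an observed embedding against embeddings pre-loaded from a threat library and returns a match only when the cosine similarity clears a strict threshold. The names, vectors and threshold here are invented purely for illustration.

```python
import math

def cosine_similarity(a: list, b: list) -> float:
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def strike_authorised(observed: list, threat_library: dict,
                      threshold: float = 0.95):
    """Return the matched identity if any library entry clears the
    threshold, else None (no authorisation)."""
    best_id, best_sim = None, threshold
    for identity, signature in threat_library.items():
        sim = cosine_similarity(observed, signature)
        if sim >= best_sim:
            best_id, best_sim = identity, sim
    return best_id

# Toy embeddings; real ones would come from a trained face-recognition model.
library = {"target-A": [0.9, 0.1, 0.4], "target-B": [0.1, 0.8, 0.6]}
print(strike_authorised([0.91, 0.09, 0.41], library))  # matches "target-A"
print(strike_authorised([0.5, 0.5, 0.5], library))     # None: no authorisation
```

The design choice worth noting is the conservative default: anything below the threshold returns no authorisation, mirroring the article's point that the strike should proceed only on a positive fix.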
While the technology is still some distance away, as recently as April 2018 open sources reported Google's teaming with the US Project Maven, which features an Algorithmic Warfare Cross-Functional Team (AWCFT) working on computer-vision algorithms for object detection, classification and alerts in drone missions.
In fact, there has been an in-house protest at Google, with employees fearing "irreparable brand damage" if the company is seen assisting the development of "weaponised AI".
Be that as it may, AI's time has already come to enter the "dull", "dirty" and "dangerous" world of drones. The infusion of AI tools as technological enablers into the UAS domain (as in many other fields of human endeavour) is now only a matter of time, and beginnings have already been made.
It is true that AI can actually reduce some of the bloodstains trailing drone hunter-killer missions by bringing precision and automation to surveillance and target identification, as well as by enabling precise and unfailing "kill decisions".
Besides bringing precision to the kill decision in the future, what AI can do in the immediate and near term is bring about a paradigm shift in the surveillance capability of drones.
Here, AI suites, duly programmed and pre-fed with relevant information, will enable drones to keep station precisely and with unfailing machine accuracy over long endurance hours, and to immediately report the minutest changes to the "default environment setting" built into their memory. The richer the threat library of built-in information, the better the surveillance accuracy. This image-recognition capability will be most useful in swarms of micro drones configured into gridded patterns and deployed in the field for station-keeping and surveillance tasks covering large swathes of area. AI will also help the swarm keep the integrity of the grid alive and continue operating even if a finite number of micro drones are shot down or become non-functional for any reason whatsoever.
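The grid-integrity idea can be sketched as a simple re-assignment problem: a swarm holds stations on a surveillance grid, and when some drones are lost the survivors are spread back over the full grid so coverage stays as even as possible. The grid layout and the assignment rule below are invented for illustration; a real swarm would use a distributed protocol, not this centralised sketch.

```python
def grid_stations(rows: int, cols: int) -> list:
    """All station positions in a rows x cols surveillance grid."""
    return [(r, c) for r in range(rows) for c in range(cols)]

def reassign(stations: list, surviving_drones: int) -> list:
    """Spread the surviving drones evenly across the full station list,
    so no part of the grid goes completely dark."""
    if surviving_drones <= 0:
        return []
    if surviving_drones >= len(stations):
        return list(stations)
    step = len(stations) / surviving_drones
    return [stations[int(i * step)] for i in range(surviving_drones)]

stations = grid_stations(4, 4)      # 16 stations in a 4x4 grid
covered = reassign(stations, 12)    # 4 drones lost; 12 survive
print(len(covered), "stations still manned out of", len(stations))
```

The point of the even-spread rule is graceful degradation: losing a quarter of the swarm thins coverage everywhere slightly rather than opening a contiguous hole in one corner of the grid.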
In the face of ever-increasing public pressure, the day is not far when precision-kill systems will be developed that negate the limitations of the current arsenal in taking out humans as single targets. A whole promising world of soft-kill weapons is waiting to meet this challenge.
In the context of AI, it is important to flag a recent decision of the Government of India to induct AI into its defence forces with the aim of enhancing operational preparedness. As recently as 21 May 2018, Secretary (Defence Production) Mr Ajay Kumar stated that the Ministry of Defence had set up a Task Force under the chairmanship of Mr N Chandrasekaran, Chairman of Tata Sons, to finalise the specifics and framework of the project.
It is a Multi-Stakeholder Task Force (MSTF) that includes the defence forces, DRDO, the Government, BEL, experts, professionals and startups. The MSTF is initially examining the kinds of requirements involved, and how the tools of AI, along with robotics, the internet and machine learning (actually the drivers of the fourth industrial revolution), could combine to empower our defence forces. The MSTF could not have been better timed.
While AI in hunter-killer operations in our scenario may not be "knocking at the door", it is certainly "round the corner".