Swarms of robots with the ability to kill humans are no longer just the stuff of science fiction films like The Terminator but an ‘in your face’ reality today. Lethal Autonomous Weapons Systems (LAWS) have arrived on battlefields across the globe. Recent armed conflicts have witnessed the increasing use of weapons that make their own judgment on target selection and engagement; such systems have been employed in Libya, Syria and Nagorno-Karabakh. Many of them are loitering munitions: miniature versions of the remote-controlled drones that the world saw deployed extensively in Iraq and Afghanistan, except that they carry a built-in explosive and destroy themselves on impact with their target rather than releasing missiles under remote control. In Ukraine, the Russian military may have deployed the artificial intelligence (AI)-enabled Kalashnikov ZALA Aero KUB-BLA loitering munition[1], while Kyiv has used Turkish-made Bayraktar TB2 drones known to possess certain autonomous capabilities[2]. A 2021 United Nations report suggested that Turkish-made Kargu-2 drones had been used to autonomously hunt down retreating fighters in Libya’s civil war[3].
With the use of machine learning and increasingly complex AI algorithms, the efficiency and lethality of autonomous weapons is bound to increase, and fully autonomous systems will be capable of assessing the tactical context and deciding on appropriate action. Some LAWS already have loiter capability and the ability to identify, recognise and track targets, including enemy tanks, artillery systems, radar systems and even specific individuals. These weapon systems vary vastly in size and in the complexity of their autonomy. While the Turkish Kargu-2 drone weighs just 15 pounds, the US has tested AI on full-scale aircraft such as the L-39 Albatros jet and the X-47B combat drone[4]. In the maritime domain, the US has tested an Anti-Submarine Warfare Continuous Trail Unmanned Vessel (ACTUV), nicknamed the Sea Hunter, sailing it from San Diego, California, to Pearl Harbor, Hawaii; it is designed to travel the oceans for months at a time with no crew on board, searching for enemy submarines and reporting their locations and findings to remote human operators. In future, such ACTUVs may be capable of autonomously attacking submarines in accordance with sophisticated algorithms[5]. Similarly, Russia is reported to have trialled autonomous tanks in Syria. The advent of LAWS is thus being dubbed the most promising of the Revolutions in Military Affairs (RMAs).
The excitement in this new field of technology has spurred new research, prototypes and increasingly operational autonomous weapons with growing degrees of autonomy and lethality. Not only are new weapons being built with greater autonomy, autonomy is also being added to existing weapon systems across the entire engagement cycle: surveillance, tracking, identification, target selection and designation, engagement and damage assessment. AI by itself means very little, but the possibilities it holds, both progressive and egregious, are limitless. No wonder, then, that global military spending on AI and LAWS is witnessing substantial growth and is anticipated to rise significantly over the years. The global autonomous weapons market was valued at $11,565.2 million in 2020 and is projected to reach $30,168.1 million by 2030, registering a CAGR of 10.4%[6]. BAE Systems plc, Israel Aerospace Industries Ltd, Kongsberg Gruppen ASA, Lockheed Martin Corporation, MBDA, Northrop Grumman Corporation, Rafael Advanced Defense Systems Ltd, Raytheon Technologies Corporation, Rheinmetall AG, Thales Group and Chinese defence-sector enterprises are some of the key players in the global autonomous weapons market. Recognising the importance of AI, China in 2017 released its strategic plan, the ‘New Generation Artificial Intelligence Development Plan’, with the goal of becoming the world leader in AI by 2030[7]. In January 2023, China commissioned a giant AI-enabled drone carrier ship, the Zhuhaiyun, which carries AI-enabled UAVs, USVs and UUVs simultaneously to monitor its surroundings and generate vast amounts of data[8].
The removal of humans from the decision-making loop on the battlefield is the cause of much of the opposition to the development and use of LAWS globally. LAWS can also lower the conflict threshold, since the absence of human conscience and wisdom from the loop may lead to unintended conflicts. Evil as they may be, like many other banes of technology, LAWS are here to stay, and India needs to contemporise itself to this new devil.
India needed to develop this technology yesterday, for two reasons. Firstly, India is in a state of perpetual conflict with its two neighbours, who work in collusion against her. AI-enabled platforms can make all the difference between victory and defeat at the tactical level and conserve precious human resources. While the western adversary is already catered for, India needs to catch up with China in this field, as a technological asymmetry does not augur well for her operational preparedness. Moreover, an India on the path of economic development and emergence as a global power cannot afford to be dragged down by numerous internal conflicts, and this technology can be used to ensure their prompt and efficient resolution. Secondly, given the ethical, moral and legal issues related to LAWS, a restrictive AI regime will come into being sooner or later, and its rules will be decided by the states in possession of this technology. As is their wont, the rules will be restrictive not for them but for the ‘have-nots’. India certainly does not want to find herself in that uncomfortable situation.
Some analysts, genuinely concerned about both low-end and high-end proliferation of lethal autonomous weapon systems, have argued that the world needs to frame rules to regulate them, taking a leaf out of the nuclear playbook. While the new, confident, assertive India is no longer the India of yore, it would serve her well to remember that India had the technological prowess to develop a nuclear weapon during the lifetime of Homi J. Bhabha but vacillated over it on moral considerations until 1974. By then the NPT had kicked in, and India was made an international pariah. Later, in 1998, after Operation SHAKTI, India again had to face economic and technological sanctions for a couple of years.
United Nations Secretary-General António Guterres has repeatedly urged states to prohibit weapons systems that could, by themselves, target and attack human beings, calling them “morally repugnant and politically unacceptable”[9]. Yet the US, the UK and Russia have refused to ban lethal autonomous weapon systems on one ground or another, as they do not want to forgo any technological advantage they hold over their adversaries[10]. The US has touted LAWS as offering not only military advantage but also humanitarian benefits[11]. China’s stand on the issue has been ambiguous, while the countries of the Global South favour a total ban or strict legal regulation of LAWS. India’s nuanced stand has been that steps are needed to prevent a technological disparity between states that encourages wider armed conflict through the expectation of fewer casualties on the side possessing superior technology. So far, the talks in the UN have been at a preliminary stage, under the Convention on Certain Conventional Weapons. Sooner rather than later, however, there will be an international regulatory framework, akin to the NPT regime for nuclear weapons, to ban or curb the development and use of LAWS. It is imperative that India develop this technology for its land, air and maritime forces at warp speed, so as to earn the right to sit at the international high table and shape decisions on usage, restrictions and regulation as a ‘have’ rather than a ‘have-not’ of this critical technology, which has already started to shape battlefields around the world.
***************************************************************************************************************
References:
[1] Zachary Kallenborn, Russia may have used a killer robot in Ukraine. Now what?, Bulletin of the Atomic Scientists, 15 Mar 2022, available at https://thebulletin.org/2022/03/russia-may-have-used-a-killer-robot-in-ukraine-now-what/?utm_source=Newsletter&utm_medium=Email&utm_campaign=ThursdayNewsletter03172022&utm_content=DisruptiveTechnologies_KillerRobotInUkraine_03152022
[2] Robert F Trager & Laura M Luca, Killer Robots Are Here—and We Need to Regulate Them, Foreign Policy, 11 May 2022, available at https://foreignpolicy.com/2022/05/11/killer-robots-lethal-autonomous-weapons-systems-ukraine-libya-regulation/
[3] Final report of the Panel of Experts on Libya established pursuant to Security Council resolution 1973 (2011), United Nations Security Council, 2021, p. 17, available at https://documents-dds-ny.un.org/doc/UNDOC/GEN/N21/037/72/PDF/N2103772.pdf?OpenElement
[4] Vivek Wadhwa & Alex Salkever, Killer Flying Robots Are Here. What Do We Do Now?, Foreign Policy, 5 Jul 2021, available at https://foreignpolicy.com/2021/07/05/killer-flying-robots-drones-autonomous-ai-artificial-intelligence-facial-recognition-targets-turkey-libya/
[5] Michael T Klare, Autonomous Weapons Systems and the Laws of War, Arms Control Today, Mar 2019, available at https://www.armscontrol.org/act/2019-03/features/autonomous-weapons-systems-laws-war#endnote01
[6] Allied Market Research, Autonomous Weapons Market, available at https://www.alliedmarketresearch.com/autonomous-weapons-market-A13132
[7] In Their Own Words: New Generation Artificial Intelligence Development Plan, China Aerospace Studies Institute, Air University, available at https://www.airuniversity.af.edu/CASI/Display/Article/2521258/in-their-own-words-new-generation-artificial-intelligence-development-plan/
[8] Liu Zhen, Giant Chinese drone-carrying AI ship enters service as research vessel, South China Morning Post, 13 Jan 2023, available at https://www.scmp.com/news/china/science/article/3206781/giant-chinese-drone-carrying-ai-ship-enters-service-research-vessel
[9] UN News, 25 Mar 2019, available at https://news.un.org/en/story/2019/03/1035381
[10] Damien Gayle, UK, US and Russia among those opposing killer robot ban, The Guardian, 29 Mar 2019, available at https://www.theguardian.com/science/2019/mar/29/uk-us-russia-opposing-killer-robot-ban-un-ai
[11] Final Report, National Security Commission on Artificial Intelligence, Mar 2021, available at https://www.nscai.gov/wp-content/uploads/2021/03/Full-Report-Digital-1.pdf