
International law: AI use in lethal weapons? | By Syed Qamar Afzal Rizvi


IN my recently published column, ‘Militarization of the Artificial Intelligence’ (24 Feb), I discussed the dynamics of using Artificial Intelligence (AI) in the militarization of nuclear weapons.

Since the use of artificial intelligence in kinetic targeting has become an important humanitarian and legal issue, a global debate has begun: as AI plays an increasingly active role in warfare, its legal and ethical boundaries must be drawn.

Thus, the core of my argument in this article is that ceding human control to AI-enabled capabilities is steadily undermining the framework of International Humanitarian Law (IHL), thereby “leaving the battle space legally ungoverned and civilians unprotected”.

Following the bans on biological weapons, chemical weapons and blinding laser technology, international pressure is growing for a similar ban on killer robots, that is, lethal autonomous weapon systems (LAWS), including drones that strike enemy targets without direct human control.

The United Nations Convention on Certain Conventional Weapons (CCW or CCWC), concluded at Geneva on October 10, 1980, and in force since December 1983, seeks to prohibit or restrict the use of certain conventional weapons that are considered excessively injurious or whose effects are indiscriminate.

“The United Nations Institute for Disarmament Research (UNIDIR), the in-house independent research arm of the United Nations on disarmament issues, contributed by developing a primer and other briefing material for negotiators and researchers.”

Viewed historically, “conventional weapons-related arms control tended to play second fiddle to strategic weaponry during the Cold War era.

This imbalance persisted, even though technology and security trends began to shift in the late 1990s.

The multilateral ecosystem for dealing with advanced conventional weaponry outside of ad hoc export control regimes—such as the 1996 Wassenaar Arrangement—remained relatively underdeveloped”.

Cyber warfare, meanwhile, is when States or other actors use offensive or defensive means against each other in cyberspace.

As an activity not bound by geographical limitations, it falls under international law in general.

Moreover, where such hostile activity rises to the level of an ‘attack’, it is governed by international humanitarian law, because an attack initiates an armed conflict under this regime.

In 2013, the Meeting of the High Contracting Parties to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons (CCW) agreed on a mandate on lethal autonomous weapon systems (LAWS).

It mandated its Chairperson to convene an informal Meeting of Experts ‘to discuss the questions related to emerging technologies in the area of lethal autonomous weapons systems in the context of the objectives and purposes of the Convention’.

Such meetings of experts were convened three times, in 2014, 2015 and 2016, and produced reports which fed into meetings of the High Contracting Parties to the Convention.

In 2016, at the Fifth CCW Review Conference, the High Contracting Parties decided to establish an open-ended Group of Governmental Experts on emerging technologies in the area of LAWS (GGE on LAWS), to build on the work of the previous meetings of experts.

The group met in 2017 and was re-convened in 2018, 2019 and 2020–2021.

Most importantly, the consensus conclusions reached in 2018 allowed the group to focus on: 1) the characterization of the systems under consideration (the so-called definitional issue); 2) aspects of human-machine interaction, which are critical to concerns about potential violations of IHL; and 3) possible options for addressing the humanitarian and international security consequences of deploying such systems.

The latest episode, India’s firing of a cruise missile into Pakistani airspace on March 9, sets a dangerous precedent.

The 10 principles for ethical AI, formulated in 2017, include: the applicability of IHL; non-delegation of human responsibility; accountability for the use of force in accordance with international law; weapons reviews before deployment; the incorporation of physical, non-proliferation and cyber-security safeguards; risk assessment and mitigation during technology development; and consideration of the use of emerging technologies in the area of LAWS in compliance with IHL.

In 2018, States agreed to establish two inter-governmental processes on security-related issues in cyberspace.

“All newly developed weapons must comply with Article 36 of Additional Protocol I, which places every state under an obligation to review them to ensure that they do not violate international law.

The key challenges posed by autonomous weapons to international law are those of accountability and attribution.

Under the principle of distinction, combatants and civilians must be distinguished and only the former may be targeted; it is debatable whether machines that lack human judgment and reasoning will be able to operate under this principle”.

Against the backdrop of this principle, US drone strikes in Afghanistan, Libya and Syria have killed innocent civilians.

It is further alleged that in the ongoing war in Ukraine both sides have used such dangerous technologies: Ukraine has deployed the Turkish-made TB2 drone, whereas Russia has used the Lancet, a loitering munition.

Broadly speaking, states fall into three categories with regard to their obligations concerning the use of AI in military weapons: first come the P5+1 global powers (the US, Russia, China, France, the UK and Germany), which are not interested in accepting any limitation; second come those states that want a political settlement of the issue; and finally come those countries that want to limit the unbridled use of AI.

Arguably, given the growing danger posed by the absence of rules of accountability, it is paramount to develop a robust legal framework that actively shapes the direction of the militarization of artificial intelligence before it moves beyond the control and capacity of the international community.

That is why it is argued that “highly autonomous AI systems should be designed so that their goals and behaviours can be assured to align with human values throughout their operation”.

As per the ICJ’s ruling, IHL “is designed in such a way that it applies ‘to all forms of warfare and to all kinds of weapons’”, including fourth- and fifth-generation supersonic missile technology.

In the current scenario, the issue has gained profound impetus, owing both to the ongoing cyber war in Ukraine between the West and Russia and to India’s culpable BrahMos missile launch into Pakistan’s airspace on 9 March.

—The writer, an independent IR researcher and international law analyst based in Pakistan, is a member of the European Consortium for Political Research Standing Group on IR and Critical Peace & Conflict Studies, as well as a member of the Washington Foreign Law Society and the European Society of International Law.

 
