
International law: AI use in lethal weapons? | By Syed Qamar Afzal Rizvi



IN my recently published column, ‘Militarization of the Artificial Intelligence’ (24 Feb), I discussed the dynamics of using Artificial Intelligence (AI) in the militarization of nuclear weapons.

Since the use of artificial intelligence in kinetic targeting has become an important humanitarian and legal issue, a global debate has begun: as AI’s role in military warfare grows increasingly active, its legal and ethical boundaries must be drawn.

Thus, the core of my argument in this article is that ceding human control to AI-enabled capabilities is steadily undermining and eroding the framework of International Humanitarian Law (IHL), thereby “leaving the battle space legally ungoverned and civilians unprotected”.

With bans already in place on biological weapons, chemical weapons and blinding laser technology, international pressure is growing for a similar ban on killer robots, lethal autonomous weapons systems (LAWS), including drone technology that can strike enemy targets without direct human control.

The United Nations Convention on Certain Conventional Weapons (CCW or CCWC), concluded at Geneva on October 10, 1980, and in force since December 1983, seeks to prohibit or restrict the use of certain conventional weapons considered excessively injurious or indiscriminate in their effects.

“The United Nations Institute for Disarmament Research (UNIDIR), the in-house independent research arm of the United Nations on disarmament issues, contributed by developing a primer and other briefing material for negotiators and researchers.”

Historically, “conventional weapons-related arms control tended to play second fiddle to strategic weaponry during the Cold War era.

This imbalance persisted, even though technology and security trends began to shift in the late 1990s.

The multilateral ecosystem for dealing with advanced conventional weaponry outside of ad hoc export control regimes—such as the 1996 Wassenaar Arrangement—remained relatively underdeveloped”.

Cyber warfare, meanwhile, occurs when States or other actors use offensive or defensive means against each other in cyberspace.

As an activity that is not bound by geographical limitations, it is subject to international law in general.

Moreover, where such hostile activity rises to the level of an ‘attack’, it is governed by international humanitarian law, because an attack initiates an armed conflict under this regime.

In 2013, the Meeting of the High Contracting Parties to the CCW agreed on a mandate on LAWS.

It mandated its Chairperson to convene an informal Meeting of Experts ‘to discuss the questions related to emerging technologies in the areas of lethal autonomous weapons systems in the context of the objectives and purposes of the Convention’.

Such meetings of experts were convened three times, in 2014, 2015 and 2016, and produced reports which fed into meetings of the High Contracting Parties to the Convention.

In 2016, at the Fifth CCW Review Conference, the High Contracting Parties decided to establish an open-ended Group of Governmental Experts on emerging technologies in the area of LAWS (GGE on LAWS), to build on the work of the previous meetings of experts.

The group was re-convened in 2017, 2018, 2019, and 2020–2021.

Most importantly, the Group’s consensus conclusions in 2018 allowed it to focus on: 1) the characterization of the systems under consideration, the so-called definitional issue; 2) aspects of human-machine interaction, which are critical to concerns about potential violations of IHL; and 3) possible options for addressing the humanitarian and international-security consequences of deploying such systems.

The latest episode, the firing of an Indian cruise missile into Pakistani airspace (March 9), has set a dangerous precedent.

By contrast, the 10 principles for ethical AI (formulated in 2018) include: the applicability of IHL; the non-delegation of human responsibility; accountability for the use of force in accordance with international law; weapons reviews before deployment; the incorporation of physical, non-proliferation and cyber-security safeguards; risk assessment and mitigation during technology development; and consideration of the use of emerging technologies in the area of LAWS in compliance with IHL.

In 2018, States agreed to establish two inter-governmental processes on security-related issues in cyberspace.

“All newly developed weapons must comply with Article 36 of Additional Protocol I, which states that every state is under an obligation to review them to ensure that they do not violate international law.

The key challenges posed by autonomous weapons to international law are those of accountability and attribution.

Under the principle of distinction, combatants and civilians must be distinguished and only the former may be targeted; it is debatable whether machines that lack human judgment and reasoning will be able to operate under this principle.”

Against the backdrop of this principle, US drone strikes in Afghanistan, Libya and Syria have killed innocent civilians.

It is further alleged that in the ongoing war in Ukraine both sides have used dangerous technologies: Ukraine has used the Turkish-made Bayraktar TB2 drone, whereas Russia has used the Lancet, a loitering munition.

Broadly speaking, there are three categories of states with regard to their obligations concerning the use of AI in military weapons: first, the P5+1 global powers (the US, Russia, China, France, the UK and Germany), which are not interested in accepting any limitations; second, those states that want a political settlement of the issue; and finally, those countries that want to limit the unbridled use of AI.

Arguably, given the growing danger posed by the absence of rules of accountability, it is paramount to develop a robust legal framework that actively shapes the direction of the militarization of artificial intelligence before it moves beyond the control and capacity of the international community.

That is why it is argued that “highly autonomous AI systems should be designed so that their goals and behaviours can be assured to align with human values throughout their operation”.

As per the ICJ’s ruling, IHL “is designed in such a way that it applies ‘to all forms of warfare and to all kinds of weapons’”, including fourth- and fifth-generation weapons such as supersonic missile technology.

In the current scenario, the issue has gained profound impetus, owing both to the ongoing cyberspace war in Ukraine between the West and Russia and to India’s culpable BrahMos missile launch into Pakistan’s airspace on 9 March.

—The writer, an independent ‘IR’ researcher-cum-international law analyst based in Pakistan, is a member of the European Consortium for Political Research Standing Group on IR, Critical Peace & Conflict Studies, and also a member of the Washington Foreign Law Society and the European Society of International Law.

 
