Three things we’ll be watching at AWS2024 in Vienna
On 29-30 April, “Humanity at the Crossroads”, a conference on autonomous weapons and how to regulate them, will be held at the Hofburg Palace in Vienna, Austria.
For more than ten years, governments have attempted to deal with autonomous weapons within the UN Convention on Certain Conventional Weapons (CCW). Plagued by a paralysing consensus-based decision-making rule — a non-starter in today’s geopolitical situation — the CCW has been unable to develop the international rules needed to tackle the issue.
The Vienna conference could be the last chance to ignite a preemptive process to develop new international rules before autonomous weapons become a regular feature on the battlefield. There are a few things we will be particularly watching out for in the grand halls of the Hofburg Palace:
1 — How will the reports of Israel’s use of AI-powered targeting systems in Gaza impact the discussions?
In 2006, the humanitarian harm caused by Israel's massive use of cluster munitions in Lebanon galvanised states to launch negotiations on an international treaty prohibiting cluster munitions at the end of that year. A few weeks ago, +972 Magazine reported that Israel is using artificial intelligence in a targeting system called “Lavender” in its war on Palestine, with horrific impact for civilians.
Will the Lavender story spur governments to action in Vienna? While commentators have taken pains to point out that Lavender is not, in itself, an autonomous weapon, the story demonstrates why new rules are needed to protect civilians from these systems:
When using the Lavender system, human personnel in the Israeli military have only seconds to confirm a target selected by the AI system — an unacceptable rubber-stamp procedure, especially because the system is known to make what are regarded as “errors” in about ten percent of cases. This highlights the importance of “meaningful human control”, a core principle that many supporters of a treaty prohibiting fully autonomous weapons systems believe needs to be part of any future regulation of autonomous weapons.
Lavender shows how AI systems can be used to avoid accountability for military personnel and leaders. In the reporting by +972 Magazine, key questions were raised about who is responsible for the decision-making process and how AI can be used to shield individuals from accountability under the laws of war.
Lavender shows how rapidly the concept of “proportionality” can change. According to Israeli sources, “for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians. Were the target a senior Hamas official with the rank of battalion or brigade commander, the killing of more than 100 civilians in the assassination of a single commander was authorised”. Such numbers are exceptionally high by the standards of responsible militaries and cannot be considered proportional under the laws of war.
How will states, international organisations and civil society participating in the Vienna conference confront this new reality? Will it shake up the talks and give them a greater sense of urgency — or will governments seek to dodge it and shut real-life cases out of the conversation?
2 — How will Austria shape the conference outcome, and what will it lead to?
When Austria hosts international conferences on weapons, treaty negotiations tend to follow. From the first expert meeting on a possible ban on landmines in February 1997, to the 2007 Vienna Conference on cluster munitions, the 2014 Vienna conference on the humanitarian impact of nuclear weapons, and the 2019 Vienna Conference on Protecting Civilians in Urban Warfare, negotiations of treaties or other instruments have followed. Austria and its conferences often play a key role in catalysing global action.
While all these instruments differ in scope, legal nature and adherence, the role of Austrian diplomats such as Thomas Hajnoczni, Alexander Kmentt and their teams cannot be overestimated when it comes to bringing about new international instruments to address weapons issues.
According to Austria, the conference “Humanity at the Crossroads” will conclude with a summary that Austria will submit as input to the UN Secretary-General’s report on how to address autonomous weapons. Austria has also indicated that there will be an additional “chair’s outcome document”.
Knowledgeable Spoilers will remember that the “Austrian Pledge” “to fill the legal gap” on nuclear weapons, issued at the end of its 2014 conference, quickly morphed into a Humanitarian Pledge when states around the world asked to join Austria in its pledge. The pledge became a key part of building confidence amongst a core group of states to launch a process to negotiate a treaty banning nuclear weapons.
With last week’s news that West African states support the immediate start of treaty negotiations on autonomous weapons, and with the Belgian parliament moving ahead with a national prohibition of such weapons, we’ll be closely watching what comes out of the closing session in Vienna.
Will it build momentum and enough confidence for states to launch negotiations? Or will it simply become a paper sent through the UN paper mill for the UN Secretary-General’s report? Spoiler Alert will keep you updated!
3 — Is the civil society campaign ready for this moment?
With the use of AI on the battlefield in Ukraine and in Gaza, debates about autonomous weapons are rapidly shifting from a decade-long hypothetical conversation in the Geneva UN bubble to real life scenarios and, potentially, treaty negotiations.
Leveraging the growing public awareness of the dangers of AI and autonomous technologies and securing a mandate for treaty negotiations will place civil society under pressure. Throw in an increasingly powerful tech sector, add new allies from the existential-risk and tech crowd to the more traditional humanitarian law actors, and place them all in a generally polarised and difficult international context, and you have a very challenging time for anyone trying to build coalitions and alliances around international legal instruments.
After a tumultuous six months in the Stop Killer Robots campaign, will the campaign seize this moment and mobilise a larger public movement that can pull this kind of process off? And at a time when we see international law and treaties being ignored, can they make the case that this one will have an impact?
The role of civil society is absolutely fundamental in building political leadership and courage to move forward with a process to develop new international law. Without effective public pressure, issues can easily fizzle out and the normative impact of these instruments will be reduced. After 10 years in a secluded bubble at the CCW in Geneva, is the campaign to stop killer robots ready to take the issue out of its bureaucratic UN frame and make it exciting and relevant for a wider audience?
The Conference “Humanity at the Crossroads: Autonomous Weapons Systems and the Challenge of Regulation” will take place at the Hofburg Palace in Vienna, Austria on 29-30 April. It will be livestreamed here.