A Ban on Lethal Autonomous Weapons



Section seven of nine of SGI President Daisaku Ikeda’s 2019 peace proposal, “Toward a New Era of Peace and Disarmament: A People-Centered Approach.”

A military unit marches to the front along Moscow Avenue in Leningrad during the siege of the city (1941) [Photo by RIA Novosti archive, image #178610/Boris Kudoyarov/CC BY-SA]

My third proposal is to establish a legally binding instrument that prohibits all lethal autonomous weapon systems (LAWS), also known as AI weapons or killer robots. Although such weapon systems have yet to be deployed, they are under development in several countries. There is growing international concern that if any country were to deploy them for military use, the impact would be equivalent to that of the advent of nuclear weapons, radically transforming the global security environment. One of the threats posed by LAWS is that they make it possible to wage combat without direct human intervention, lowering the threshold for military action and risking a dramatic undermining of international humanitarian law.

We also need to consider problems that are unique to LAWS. As pointed out in the UN Disarmament Agenda, various automated weapons capable of functioning without the intervention of an operator have been developed and used over the years—from the unmanned V-1 flying bombs of World War II to anti-personnel landmines, which remain buried in many places around the world. The agenda expresses concern over the fact that LAWS pose an entirely different level of threat: their incorporation of AI may cause them to perform “unanticipated or unexplainable actions.”

In 2014, an informal meeting of experts to discuss questions related to regulating LAWS was held under UN auspices, and this became one of the topics I discussed with the eminent peace scholar Dr. Kevin Clements. Focusing on the dangers of robotic weapons, I highlighted the threat they present from a humanitarian perspective. These weapons, when given a command to attack, automatically go on killing with no hesitation or pangs of conscience. I also reiterated the urgent need to completely outlaw such weapons before any atrocity can take place and to create a framework to ban their development or deployment.

Referring to the international Campaign to Stop Killer Robots, Dr. Clements stressed the importance of strengthening collaboration among a broad range of actors, including the UN, members of the diplomatic community and civil society. At a meeting of governmental experts in April last year, the majority of participating states agreed on the importance of retaining human control over weapon systems, with representatives of twenty-six states calling for a total ban on LAWS. I therefore urge that a conference to negotiate a treaty banning LAWS be promptly convened in order to respond to the warnings voiced in the UN Disarmament Agenda and the concerns raised at such expert meetings.

Last February, the Japanese government announced that it had no intention of developing fully autonomous weapon systems. Last September, the European Parliament adopted a resolution calling for EU members to begin negotiating a legally binding instrument prohibiting LAWS. Meanwhile, within global civil society, the membership of the Campaign to Stop Killer Robots has expanded to eighty-nine NGOs in fifty-one countries.

For our part, last October, representatives of the SGI attended the UN General Assembly First Committee, submitting two statements. One was a joint statement issued by Faith Communities Concerned about Nuclear Weapons. Endorsed by fourteen groups and individuals from different faith traditions, including Christians, Muslims, Hindus and Buddhists, the statement called for the TPNW’s early entry into force and for substantive discussions in multilateral forums on a legally binding instrument to prohibit LAWS. The other was the SGI’s public statement highlighting the serious military threats posed by LAWS and pointing out that their use “undermines the principles of human autonomy, responsibility and dignity, as well as the right to life.”

If LAWS were to be left unregulated or even actually used, the nature of combat would be fundamentally transformed. Fully autonomous weapon systems create not only a physical disconnect—the situation in which those who direct attacks and those who are targeted are not in the same place, as already seen in the case of drone strikes—but also an ethical disconnect, completely isolating the initiator of the attack from the actual combat operation.

When considering the implications of this ethical disconnect, which in some ways are of even greater concern than the military threats posed by robotic weapon systems, I am reminded of an experience described by Richard von Weizsäcker (1920–2015), the first post-reunification president of Germany. I met with the president, the younger brother of the physicist Carl Friedrich von Weizsäcker, in June 1991, eight months after the reunification of Germany. In our discussion, we talked about the dangers inherent in the kind of closed, airless societies that both Germany and Japan experienced during the 1930s and 40s.

In his memoirs, he shared the following episode. He first visited the Soviet Union as a West German parliamentarian, in 1973, paying a visit to a memorial cemetery in Leningrad (present-day Saint Petersburg) dedicated to the colossal number of Russians killed while the city was under siege by the German army during World War II. When asked to say a few words at a formal banquet that evening, President Weizsäcker confessed that he had in fact participated in the Siege of Leningrad as a young infantryman, to which the room fell silent. He told his audience that he and his fellow soldiers had been “fully aware of the suffering on all fronts but especially in this city. And now we are here to do our part to make certain that future generations will never have to repeat our experiences.” The silence gradually gave way to a feeling of human warmth.

If fully autonomous weapon systems were to be used in actual combat, would it be possible for former enemies to experience the kind of encounter that President Weizsäcker describes? Would there be any room for deep remorse over one’s actions, a poignant sense of powerlessness in the face of war or a personal resolution to dedicate oneself to peace for the sake of future generations?

I, too, visited that memorial cemetery in Leningrad in September 1974, the year after President Weizsäcker. As I placed flowers at the monument, I offered heartfelt prayers for the repose of the deceased and renewed my vow to work for peace. When I met with Soviet Premier Alexei Kosygin (1904–80) on the final day of my stay in the country, I mentioned my visit to the cemetery. The premier responded that he had been in the city at the time of the siege and fell silent as if recalling the horrors of that time. That moment initiated a candid and open-hearted exchange of views between us. I can still picture the earnest look on the premier’s face as he related his conviction that we must first relinquish the very idea of war if we are to tackle the global challenges facing humankind. My own experience helped me grasp how uniquely valuable and important the interactions between President Weizsäcker and the Russian people must have been.

In his memoirs, President Weizsäcker vividly describes his experience of war:

“Since all the men facing each other across the battle lines worried chiefly about their own survival, we can assume that our foes were not so different from ourselves. . . I remember a silent night march in long lines in which we suddenly sensed, coming in the other direction, an equally silent line. We could barely make each other out, and yet we realized abruptly that the others were Russian. Now the crucial point for both sides was to keep calm, so we felt our way past each other in silence and unscathed. We were supposed to kill each other, yet we would have preferred to embrace each other.”

In a world of AI-controlled weapon systems, is there any chance that we would be able to “keep calm” in the face of the complicated feelings that cross the lines of friend and foe, sensing the weight of humanity bearing down upon us, and thus be able to hold off, even for a moment, the decision to attack?

It is certainly important to discuss restrictions on LAWS in light of the imperatives of international humanitarian law—such principles as protecting civilians in times of conflict and prohibiting the use of weapons that cause unnecessary suffering to combatants as well as the obligation to determine whether the employment of a new weapon would violate any existing international law. But above and beyond that, we must not overlook the ethical disconnect inherent in LAWS, which contrasts so sharply with the kind of human connection described by President Weizsäcker in his recollections. Although different in nature from nuclear weapons, any use of fully autonomous robotic weapons would have irreversible consequences for both the country using them and the country they are used against.

I therefore strongly urge all parties—those states already calling for a ban on LAWS, countries such as Japan that have declared their intention not to develop such weapons and NGOs committed to the Stop Killer Robots campaign—to come together to work for the early adoption of a legally binding instrument comprehensively prohibiting the development and use of these systems.
