Israel Utilizes Artificial Intelligence to Identify Bombing Targets in Gaza, Report Finds

By Editor

An investigation by the Israel-based publications +972 Magazine and Local Call revealed that Israel’s military has been using artificial intelligence to select its bombing targets in Gaza, resulting in thousands of civilian casualties. The system, named Lavender, was developed following the Hamas attacks of October 7, 2023, and marked as many as 37,000 Palestinians in Gaza as suspected “Hamas militants,” some of whom were authorized for assassination. Although the military denied the existence of a kill list, it confirmed the use of AI tools in target identification but maintained that they were only aids for analysts, who were required to verify targets in accordance with international law and IDF directives.

Interviews with Israeli intelligence officers revealed that they often did not independently examine Lavender’s targets before bombing them, instead serving as a “rubber stamp” for the machine’s decisions. Lavender was built using data on known Hamas operatives, but its training data also included loosely affiliated individuals, such as civil defense workers. The system was trained to identify features associated with Hamas operatives and to rank Palestinians by their similarity to known militants. Its accuracy rate was roughly 90 percent, meaning that about one in ten of the people it flagged was not actually a member of Hamas.

Intelligence officers were given a wide margin for civilian casualties under Lavender, with authorization to kill up to 15 or 20 civilians for every lower-level Hamas operative targeted; for senior Hamas officials, “hundreds” of collateral civilian casualties were authorized. Suspected operatives were tracked to their homes using a system called “Where’s Daddy?” and bombed there, often killing entire families, even when the target was not at home. The process was driven by statistics, with mistakes against individual targets accepted as long as the overall results were deemed acceptable.

Mona Shtaya, a fellow at the Tahrir Institute for Middle East Policy, explained that the Lavender system is part of Israel’s broader use of surveillance technologies against Palestinians in Gaza and the West Bank, a trend she finds particularly concerning given that Israeli defense startups aim to export their battle-tested technology. The Israeli military has deployed a range of such technologies, including mass facial recognition programs and “The Gospel,” an AI system that generates bombing targets such as buildings and structures from which militants are suspected of operating. The deployment of these technologies has contributed to a significant number of civilian casualties in Gaza.

Israel’s use of AI-driven warfare raises significant ethical and legal concerns, including potential violations of international law and human rights. The mass surveillance and killing of civilians in Gaza under the guise of warfare has been criticized as a continuation of collective punishment policies against Palestinians. The military’s reliance on advanced technologies to target and eliminate suspected militants has killed innocent civilians, underscoring the need for greater accountability and oversight in the use of AI in conflict zones.
