London: A quadcopter drone equipped with a kill list drawn from a vast database, a facial recognition camera to track targets, and a machine gun. The Israeli military’s artificial intelligence-powered systems played a central role in the Gaza war.
The ruthless efficiency of AI programs that process data to generate bombing targets, combined with reports of limited human oversight, is blamed in part for the very high civilian casualty toll.
The scale at which automation and machine learning were used and developed during the war has led military experts to conclude that the world is at a tipping point in how future wars will be fought.
For Palestinians, the legacy of this AI-driven conflict goes beyond the direct toll of death and destruction. These technologies are very likely to be turned back on the occupied territories, entrenching what many are calling “automated apartheid.”
Israel has long been accused of using the occupied Palestinian territories as a laboratory to develop advanced military and surveillance technology.
It has spent years collecting vast amounts of information and data from Gaza and the occupied West Bank. Nevertheless, in previous wars with Palestinian militants in Gaza in 2014 and 2021, the Israeli Air Force ran out of actual targets to attack.
“They hit everything they had or could identify during those wars, and they ran out of targets,” Noah Sylvia, a research analyst at the UK-based Royal United Services Institute, told Arab News.
“So they created a database of tens of thousands of targets that they could attack if necessary, in case another war broke out.”
A book published in 2021 by Yossi Sariel, then the commander of Israel’s elite signals intelligence unit, Unit 8200, offered chilling insights into the role AI could play in creating such target banks.
“Imagine 80,000 relevant targets created before a battle and 1,500 new targets created every day during a war,” he wrote in “The Human-Machine Team,” which describes how AI could change the way wars are fought.
Humans, he wrote, had become the “bottleneck” preventing the creation and approval of those targets. “A team of machines and investigators can widen bottlenecks.”
This vision became a reality when Israel launched a military operation in Gaza in retaliation for the Hamas-led attack on October 7, 2023.
During the first week of the war, the Israeli military reported dropping 1,000 bombs daily. By early December 2023, it had reported 10,000 airstrikes.
Studies have estimated that the first months of the conflict amounted to one of the most intensive bombing campaigns in history, with levels of destruction comparable to the Allied bombing of the German cities of Dresden, Hamburg and Cologne during World War II.
Details of the AI systems behind the campaign began to emerge in reporting by the Israeli magazine +972.
An investigation published in November 2023 revealed that the Israeli military was using a system called “The Gospel” to select buildings as targets at a far faster rate than had previously been possible.
One former Israeli intelligence officer described the system as a “mass assassination factory.”
Five months later, +972 revealed the existence of a program called “Lavender” that selected people, rather than structures, as targets. The system marked even the lowest-ranking members of Hamas and Islamic Jihad for air force bombing.
Sources told the magazine that the system selected 37,000 suspected militants and their homes for airstrikes early in the conflict.
The same investigation revealed another system, named “Where’s Daddy?”, which could simultaneously track thousands of individuals flagged by Lavender and send an alert when they entered their family homes.
The report said the military preferred to bomb the men in their homes at night, usually when their families were present, because it was easier to locate them there.
Sources also told +972 that in the early weeks of the war, the military decided it was acceptable to kill 15 to 20 civilians for each junior Hamas militant, and in some cases 100 civilians for senior commanders.
It also revealed alarming details about the level of human oversight of the targets generated by the AI programs, known as decision support systems.
Sources described a “rubber stamp” approach to targets flagged by the system, with as little as 20 seconds spent on each one before it was cleared for bombing.
“That would make meaningful, substantive human input in the target selection process nearly impossible,” Asaf Lubin, an associate professor at Indiana University Maurer School of Law and a former Israeli intelligence analyst, told Arab News.
Israel insists that the AI systems it uses are simply tools to identify targets, and that every target is independently verified by intelligence analysts to confirm the legitimacy of the attack.
However, Lubin suggests that, given the number of targets generated, it would have been impossible for humans to perform proper verification or to challenge the information.
“Our entire legal framework in international humanitarian law and the laws of war is based on the understanding that for those involved in targeting decisions, there is a process, an iterative process,” he said.
“Automation bias and technology have played a part in truncating that process, an iterative process that exists so that we can do more review, more scrutiny, more analysis, more questioning.”
Anwar Mhajne, an associate professor of political science at Stonehill College in Massachusetts, said the AI systems gave the Israeli military “false confidence” in its ability to select targets.
She said this led to “confirmation bias” based on the Israeli military’s existing views on Palestinians in the Gaza Strip.
“If the person approving the target has their own biases, it’s easy to say, ‘Well, the data tells me so, so it’s not my fault.’”
Mhajne added: “The way these weapons were used, all these weapons, all these systems, helped facilitate the genocide in Gaza.”
Experts agree that although AI played a role, responsibility for the huge number of civilian deaths ultimately lies with Israel and its military.
“There is no question that AI systems were used to generate kill and target lists in a way that has not been seen before in conflicts between Israelis and Palestinians,” Lubin said.
But he added that the scale of the civilian death toll “could also be related to decisions that have nothing to do with technology, decisions that have everything to do with policy changes and how the military has conducted itself in this operation and since October 7.”
Israel has so much information about Gaza that it “always knows exactly what it is doing,” RUSI’s Sylvia said. “They know how many casualties there will be.
“What artificial intelligence does is take existing operating procedures and enable them to run faster and at scale.
“How ethically we use artificial intelligence is largely correlated with existing operating procedures.”
Another legacy of the Gaza war is the involvement of major international technology companies in Israel’s military AI systems.
Microsoft announced in September that it had cut off some services to Unit 8200 following media reports that its Azure cloud was being used to store phone calls intercepted from ordinary Palestinians.
These ordinary Palestinians will live in the shadow of new AI-powered military and surveillance technologies developed by Israel during the conflict.
“Automated apartheid is not only becoming more likely, it is accelerating,” Jalal Abu Khater, policy manager at 7amleh, a nonprofit organization promoting digital rights for Palestinians, told Arab News.
“Sophisticated wartime AI systems, including predictive analytics, biometric monitoring, automated targeting, and mass data extraction, will entrench and expand regimes of occupation and apartheid.
“These tools do not disappear after a conflict; they become part of everyday governance.”
Mhajne, a Palestinian citizen of Israel, said the use of AI is entrenching the occupation and making it “increasingly sophisticated.”
“The main reason Israel is able to produce all this is its large-scale system of occupation,” she said.
Advances in AI military systems are raising concerns that the future will be dominated by killer robots such as autonomous drones and vehicles that will decide who lives and who dies.
The Gaza war showed that automated systems are already in place that work in the background to determine who or what is a target.
If future conflicts are conducted with the same recklessness seen in Gaza, where health authorities say nearly 70,000 Palestinians were killed in two years, the consequences would be devastating.
At a time when international legal norms and rules are under strain from the rapid development of automated warfare, Lubin believes Gaza is a “pivotal moment” that will “redefine wartime operations.”
“It’s not just about the targeting cycle. It’s about every aspect of the war machine,” he said.
“We are entering very dangerous times.”

