The Israel Defense Forces today denied that it used artificial intelligence systems to create "kill lists" for bombing operations in Gaza.
In a frankly dystopian investigative report, the Israeli magazine +972, which spoke to six Israeli intelligence officers on the condition of anonymity, claimed that targets for attacks in the Gaza Strip were sometimes selected by an AI system called "Lavender."
The officers said they relied on data generated by the software when carrying out bombings of Palestinians, and one anonymous source said the system's output was treated as if it were a "human decision." The report is all the more shocking following the news that the Israel Defense Forces recently killed seven aid workers, adding to the toll of innocent people killed in bombing operations over Gaza.
"Hundreds [of targets]. We would break into the system and see whom we could kill," one of the sources told +972. "This is called broad hunting: you copy and paste from the lists that the targeting system generates." The officers were reportedly not asked to "examine the underlying raw intelligence data."
The kill lists generated by the system reportedly flagged approximately 37,000 people as targets for IDF bombings. Another officer said the IDF relied "almost completely" on Lavender, even though it knew the system wasn't always reliable. He explained that a human had to "rubber stamp" the results before acting on the information, but said the decision process typically took about "20 seconds." He added, "I didn't have any added value as a person. It saved me a lot of time."
The system works by profiling the roughly 2.3 million Palestinians living in the Gaza Strip, ranking each person on the available data and assigning a score between 1 and 100 for the likelihood of belonging to Hamas. If a person was classified as a significant threat, such as a senior Hamas commander, officers said they were authorized to kill up to 100 civilians in an attack on that target.
"We took out thousands of people," another officer explained. "Instead of examining them one by one, we put everything into an automated system, and as soon as one of [the ranked individuals] was at home, he became an immediate target. We bombed him and his house."
The Guardian, which also spoke with the officials, confirmed the use of the software; one officer said he had been authorized to kill between 15 and 20 civilians in strikes on targets Lavender had given a lower rank. "You don't want to waste expensive bombs on unimportant people. It's very expensive for the country and they're in short supply," he said.
The IDF responded to the reports in a statement to The Guardian, explaining that it does, in fact, use Lavender, but only to "cross-reference sources to provide up-to-date information on terrorist military operatives."
The statement added: “The IDF does not use artificial intelligence systems to identify terrorist operatives or predict whether a person is a terrorist. In accordance with the provisions of international law, the assessment of the proportionality of an attack is made by the commander based on all available information before the attack.”
Photo: Emad El Baid/Unsplash