How Israel used the AI tool 'Lavender' during the 2021 Gaza attack

Machine Learning


Amid recent reports that the Israeli military is using an artificial intelligence-powered tool called Lavender to identify bombing targets in the Gaza Strip, a year-old video has surfaced on social media in which an official discusses how the military has used machine learning to identify targets.

The Israel Defense Forces (IDF) denies that AI is used to identify terrorist suspects, but an Israeli cyber intelligence official has detailed how machine learning techniques were used during the 2021 attack on Gaza, The Guardian reported.

Citing the example of “one of the tools,” the official, identified as “Colonel Yoav,” said: “Let's say there are some terrorists who form a group and we only know about some of them. By applying our data science magic powder, we are able to find the rest.”

The video was recorded at a conference at Tel Aviv University in February 2023. Notably, attendees were instructed not to photograph the officials or record the presentations.

The official, a member of Unit 8200, said the unit used machine learning to spot Hamas squad missile commanders and anti-tank missile terrorists in Gaza during the IDF's military operation in May 2021.

“We take the original subgroup, calculate their proximity circle, then calculate the relevant features, and finally rank the results and determine the threshold,” The Guardian quoted the intelligence official as saying.
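The steps quoted above — start from a known subgroup, expand to its proximity circle, compute features, rank, and apply a threshold — read like a generic candidate-expansion pipeline over a contact graph. The sketch below is only an illustration of that generic pattern, not the system described in the article; the graph, the single connectivity feature, and the scoring rule are all assumptions made for this example.

```python
# Illustrative sketch only: a generic "expand from a known subgroup" pipeline
# resembling the steps quoted above. The graph, feature, and threshold logic
# are hypothetical and not the actual system discussed in the article.
import networkx as nx


def expand_candidates(graph: nx.Graph, seed_members: set[str], threshold: float):
    """Rank nodes near a seed subgroup and keep those scoring above a threshold."""
    # 1. "Calculate their proximity circle": neighbours of any seed member.
    proximity_circle = {
        neighbour
        for seed in seed_members
        for neighbour in graph.neighbors(seed)
    } - seed_members

    # 2. "Calculate the relevant features": here, a single toy feature —
    #    the fraction of seed members each candidate is directly linked to.
    scores = {
        node: sum(1 for seed in seed_members if graph.has_edge(node, seed)) / len(seed_members)
        for node in proximity_circle
    }

    # 3. "Rank the results and determine the threshold".
    ranked = sorted(scores.items(), key=lambda item: item[1], reverse=True)
    return [(node, score) for node, score in ranked if score >= threshold]


if __name__ == "__main__":
    g = nx.Graph()
    g.add_edges_from([("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "E")])
    print(expand_candidates(g, seed_members={"A", "B"}, threshold=0.5))
```

A real system would presumably use far richer features and a learned ranking model; the sketch only conveys the shape of the seed-expand-rank-threshold loop the official describes.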

The colonel said feedback from intelligence personnel was used to enhance and improve the algorithm, but he stressed that “real people” make the decisions. “These tools are intended to help break down human barriers,” the Israeli official added.
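The feedback loop he describes — analysts accepting or rejecting machine-generated candidates, with those judgments flowing back into the algorithm — is a standard human-in-the-loop pattern. The snippet below, continuing the earlier sketch, is a hypothetical illustration of that pattern only; the reviewer callback and label store are invented for the example.

```python
# Hypothetical human-in-the-loop step, continuing the sketch above. An analyst's
# accept/reject decisions become new labels for later retraining; no candidate
# is acted on without that human review.
def review_and_collect_labels(candidates, analyst_decision):
    """Return (accepted, labelled_examples) after a human reviews each candidate."""
    accepted, labelled = [], []
    for node, score in candidates:
        keep = analyst_decision(node, score)   # a real person decides
        labelled.append((node, keep))          # stored as feedback for the next model run
        if keep:
            accepted.append(node)
    return accepted, labelled
```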

The intelligence officer said his unit had managed to generate more than 200 new targets. “Applying data science-driven solutions allows us to react quickly during combat,” he said of the benefits of the AI tools.

The colonel's account is similar to recent revelations made by six Israeli intelligence officials to +972 Magazine and Hebrew-language media.

The six officials said an AI-based tool called “Lavender,” which had a 10% error rate, was used to assist intelligence agents involved in the Gaza bombing campaign in identifying tens of thousands of potential human targets.

But the IDF called some of those accounts “baseless.” It denied that AI is being used to identify terrorist suspects, although it did not dispute the existence of the tool.

Published by:

Abhishek De

Published on:

April 12, 2024


