[0:00] Israel has been using an AI system called Lavender that marked at least 37,000 Palestinians as suspected terrorists
[0:09] and compiled part of its kill list during its ongoing military offensive in Gaza. According to an investigative report by +972 Magazine and Local Call, testimonies from Israeli military officials revealed how Lavender was used to assassinate individuals in Gaza with almost no human oversight to verify the targets the AI identified. Developed by Unit 8200, the Israeli military's elite intelligence division, the system analyses data obtained through mass surveillance of most of Gaza's 2.3 million residents to identify potential targets. It ranks every individual in Gaza on a scale from 1 to 100 according to the probability that they are involved with the military wings of Hamas and Palestinian Islamic Jihad. Targets mistakenly flagged by the system included relatives of Hamas operatives, people with social media connections to Hamas members, and even people who happened to share the names of Hamas operatives. Despite the Israeli military's own assessment that some of the individuals the AI identified might not be affiliated with Palestinian resistance groups, the 'kill list' was treated as definitive; the only human check was to verify that the target was male. According to the report, the system played a key role in Israel's massive bombardment of Palestinian territory, especially during the early stages of Tel Aviv's offensive. The report reveals that it was during those initial weeks that Israeli military command allowed the killing of up to 20 civilians whenever the system flagged what it assessed to be a junior Hamas member; in the case of senior Hamas operatives, the army authorized the killing of over 100 civilians to assassinate a single target. In other cases, if a potential target gave their phone to a male relative, that relative would then become the target and would be bombed along with his whole family. Additionally, a home tracking system called 'Where's Daddy?'
was used to expand the pool of targets when the pace of assassinations waned, and specifically to time bombings for when targets had entered their family homes. Low-ranking Israeli officers were the ones making decisions about who lived and died, according to the report. In addition to Lavender and 'Where's Daddy?', the report revealed that the Israeli military had been employing a third AI system in its war on Gaza, 'The Gospel', used to identify buildings where 'suspects' were 'operating' before bombing them. According to the report, the staggering death toll in Gaza is a result of Israel's weaponization of these AI programs, which experts say gives Tel Aviv a free hand to commit 'genocidal acts with impunity'. In its all-out war on Gaza, now in its sixth month, Israel has killed over 32,000 Palestinians, mostly women and children.



