
Two new recruits help Israel track down Hamas operatives in Gaza. They are both AI

blogaid.org


While Lavender identifies human targets, Gospel identifies structures and buildings.

New Delhi:

Reports have emerged claiming that the Israeli military has used advanced artificial intelligence (AI) systems in its bombing of Gaza. These systems, called Lavender and Gospel, have played a central role in the IDF’s targeting strategy, sparking debate about the ethical and legal implications of their deployment.

What is Lavender AI

Developed by Israel’s elite intelligence unit, Unit 8200, Lavender functions as an AI-powered database designed to identify potential targets linked to Hamas and Palestinian Islamic Jihad (PIJ). Lavender uses machine learning algorithms and processes massive amounts of data to identify individuals considered ‘junior’ militants within these armed groups.

According to the Israeli-Palestinian publication +972 Magazine and the Hebrew-language outlet Local Call, Lavender initially identified as many as 37,000 Palestinian men as associated with Hamas or PIJ. The use of AI to identify targets marks a significant change for Israel’s intelligence apparatus, including Mossad and Shin Bet, which previously relied on more labor-intensive human decision-making.

Soldiers often made split-second decisions, spending as little as twenty seconds on each target identified by Lavender, primarily to confirm that the target was male, before authorizing a strike. Human operators often accepted the machine’s output without question, despite the AI program’s margin of error of up to 10 percent. According to the report, the program often targeted individuals with minimal or no ties to Hamas.

What is the Gospel AI

Gospel is another AI system that automatically generates targets based on AI recommendations. Unlike Lavender, which identifies human targets, The Gospel reportedly identifies structures and buildings as targets.

“This is a system that allows the use of automatic tools to produce targets at a rapid pace and works by improving accurate and high-quality intelligence material as per requirements. Using artificial intelligence, and through the rapid and automatic extraction of updated intelligence, it provides a recommendation to the researcher, with the aim that there will be a complete match between the machine’s recommendation and the identification made by a person,” the IDF said in a statement.

The specific data sources fed into The Gospel remain confidential. However, experts suggest that AI-driven targeting systems typically analyze various data sets, including drone images, intercepted communications, surveillance data, and behavioral patterns of individuals and groups.

Ethical and legal concerns

The use of Lavender and Gospel in Israel’s bombing campaign represents a significant development at the intersection of AI and modern warfare. While these technologies offer potential gains in target identification and operational efficiency, their deployment raises serious ethical and legal dilemmas.