The Israeli army reportedly uses an AI system, 'Lavender,' to compile a 'kill list' of alleged Hamas terrorists. Six Israeli intelligence officers said the system categorized thousands of Palestinians as suspected militants, marking them as potential targets for airstrikes. Israel has denied that the AI selects targets, stating that it serves only as auxiliary support for officers in their investigative processes.
AI Program 'Lavender' Allegedly Dictates Israeli Military Targeting in Gaza Conflict
An investigation by +972 Magazine and Local Call has brought to light an AI program within the Israeli military dubbed "Lavender". Operating during the ongoing conflict, Lavender reportedly holds considerable sway over targeting decisions, with military personnel treating its outputs almost as if they were human decisions.
Lavender reportedly marks thousands of Palestinians suspected of belonging to Hamas or Palestinian Islamic Jihad, along with their residences, as possible airstrike targets.
Trained on extensive Israeli surveillance data spanning decades, Lavender assesses individuals based on their probability of being militants, utilizing the digital traces left by known militants as benchmarks.
Despite a reported error rate of roughly 10 percent, Lavender's determinations, frequently approved by human officers within seconds, guided airstrikes on residences housing suspected militants and their families. This tactic prioritized attacking individuals in their homes, usually at night, aided by auxiliary automated systems such as "Where's Daddy?".
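To put that figure in perspective, the short sketch below works through the arithmetic of a roughly 10 percent misidentification rate applied at scale. It is purely illustrative: the flagged-population figure is a hypothetical assumption, since the reporting speaks only of thousands of people being marked.

```python
# Illustrative arithmetic only. The flagged-population figure below is a
# hypothetical assumption; the reporting says only that "thousands" were marked.
flagged_individuals = 10_000   # hypothetical number of people marked by the system
reported_error_rate = 0.10     # roughly 10% of markings reported to be misidentifications

misidentified = flagged_individuals * reported_error_rate
print(f"At a {reported_error_rate:.0%} error rate, roughly {misidentified:,.0f} "
      f"of {flagged_individuals:,} flagged individuals would be misidentified.")
```

Under this assumption, a seemingly small error percentage translates into roughly a thousand misidentified people, which is the scale of concern the intelligence sources describe.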
Reports from Gaza suggest a death toll of over 30,000 Palestinians, with Israeli insiders confirming that thousands were killed in airstrikes on targets selected by AI. Lavender was allegedly allowed to draft official kill lists just two weeks into the conflict, which began in October 2023, with IDF guidelines permitting up to 100 bystander casualties in strikes targeting alleged Hamas commanders.
Despite the IDF's denial of AI involvement in identifying suspected terrorists, Israeli intelligence sources corroborate Lavender's role in selecting airstrike targets. Human operators reportedly gave only cursory approval to Lavender's decisions, spending around 20 seconds per target before authorizing airstrikes.
These revelations come amid heightened international scrutiny, following targeted airstrikes that killed foreign aid workers in Gaza and a deepening humanitarian crisis.
AI's Impact on Warfare and Decision-Making
The discourse on advanced artificial intelligence has predominantly revolved around the potential displacement of white-collar professionals, akin to the impact robotics once had on blue-collar workers. However, its effects are expected to extend beyond fields such as law, accounting, and journalism.
The emergence of generative pre-trained transformers (GPTs), an advanced form of AI, is poised to revolutionize global military strategies and deterrence mechanisms, albeit with potentially unsettling consequences.
This technology is expected to transform warfare in particular by expanding the use of AI-driven drones across military branches, including air forces, navies, and armies.
At the same time, there are concerns that AI-accelerated decision-making could shrink strategic and tactical windows from hours or days to minutes, with far-reaching implications even in nuclear warfare scenarios.
As AI permeates command and control processes, there's a risk of overreliance on automated systems, potentially compromising human oversight and leading to unintended consequences, as envisioned in scenarios of accidental nuclear conflict precipitated by AI-driven intelligence.