Apologies.
Politics and religion were not intended to be discussed here. There are plenty of better-written and better-funded places to find that information.
I felt compelled in this instance.
A.I. on the battlefield.
The Israel Defense Forces (IDF) uses “Lavender,” an A.I. capability for selecting the most likely, or “best,” targets among suspected Hamas militants. I would feel like a hypocrite debating the advantages and shortcomings of this technology (A.I.) without acknowledging news that is staring me in the face. Though it falls outside our usual advertising focus, here we go, with just one comment from me.
“Lavender”
This week, the Washington Post covered a lengthy exposé by Israeli journalist and filmmaker Yuval Abraham on the existence of the Lavender program and its use in the Israeli campaign in Gaza that followed Hamas’s deadly Oct. 7 terrorist attack on southern Israel.
Per the Washington Post:
Abraham’s reporting drew on the testimony of six anonymous Israeli intelligence officers, all of whom served during the war and had “first-hand involvement” with the use of AI to select targets for elimination. According to Abraham, Lavender identified as many as 37,000 Palestinians — and their homes — for assassination. (The IDF denied to the reporter that such a “kill list” exists, and characterized the program as merely a database meant for cross-referencing intelligence sources).
Abraham’s reporting appeared in +972 magazine, a left-leaning Israeli English-language website, and in Local Call, its sister Hebrew-language publication.
The Washington Post continues:
Formally, the Lavender system is designed to mark all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad (PIJ), including low-ranking ones, as potential bombing targets. The sources told +972 and Local Call that, during the first weeks of the war, the army almost completely relied on Lavender, which clocked as many as 37,000 Palestinians as suspected militants — and their homes — for possible air strikes.
Okay - here is my one comment. Probably three careers ago, I led teams designing flight simulators for the U.S. Navy. Seriously. We had a term, “Man-in-the-Loop,” that was part of any weapons fire control system: a human had to review and approve before the system acted. It is surprising that this appears to have been bypassed with A.I., especially since any A.I. model, whether an LLM (Large Language Model), LMM, or SLM, is always learning. Because it improves over time, its earliest outputs are its weakest, and that alone foreshadows degraded initial capability. That might be an understatement.
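To make the term concrete: a man-in-the-loop design means the software only nominates; a human must explicitly approve before anything happens. Here is a minimal, purely illustrative sketch of that gate in Python. The names, the confidence field, and the console prompt are my own assumptions for the example and have nothing to do with Lavender, The Gospel, or any real fire control system.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    """A hypothetical model-generated recommendation awaiting human review."""
    identifier: str
    model_confidence: float  # 0.0-1.0 score from some upstream model (illustrative)

def human_approves(candidate: Candidate) -> bool:
    """Blocking prompt: nothing proceeds until a person explicitly says yes."""
    answer = input(
        f"Review {candidate.identifier} "
        f"(model confidence {candidate.model_confidence:.2f}). Approve? [y/N] "
    )
    return answer.strip().lower() == "y"

def process(candidates: list[Candidate]) -> list[Candidate]:
    """Man-in-the-loop gate: the model nominates, the human decides."""
    approved = []
    for c in candidates:
        if human_approves(c):  # the human is the final authority, not the score
            approved.append(c)
        # anything not explicitly approved is simply dropped
    return approved

if __name__ == "__main__":
    queue = [Candidate("item-001", 0.91), Candidate("item-002", 0.42)]
    print(process(queue))
```

The point of the pattern is not the code; it is that the model’s confidence score never triggers an action on its own.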

“The Gospel”
Lavender is not the only smart tool in the IDF’s A.I. arsenal for warfare. Per +972 magazine:
The Lavender machine joins another AI system, “The Gospel” (also known by the name “Habsora”) about which information was revealed in a previous investigation by +972 and Local Call in November 2023, as well as in the Israeli military’s own publications (you can select “English” or “Hebrew” at the top center of the page). A fundamental difference between the two systems is in the definition of the target: whereas The Gospel marks buildings and structures that the army claims militants operate from, Lavender marks people — and puts them on a kill list.
+972 magazine continues, describing some of The Gospel’s capabilities:
Several of the sources, who spoke to +972 and Local Call on the condition of anonymity, confirmed that the Israeli army has files on the vast majority of potential targets in Gaza — including homes — which stipulate the number of civilians who are likely to be killed in an attack on a particular target. This number is calculated and known in advance to the army’s intelligence units, who also know shortly before carrying out an attack roughly how many civilians are certain to be killed.
For more information on “Lavender” and “The Gospel,” here are the links. There is plenty more to read.