Lavender: Israel offers a glimpse into the terrifying world of military AI.

You are reading an excerpt from today’s Worldview newsletter. Sign up to get the rest for free, including news from around the world and interesting ideas and opinions to know, delivered to your inbox every weekday.

It’s hard to coin an airier sobriquet. According to a new report, Israel has reportedly used an AI-powered database to target Hamas and other militants in the besieged Gaza Strip. The tool, trained by Israeli military data scientists, sifted through a vast trove of surveillance data and other information to generate assassination targets. It may have played a particularly important role in the early stages of the current war, as Israel carried out constant airstrikes on the territory, flattening homes and entire neighborhoods. So far, according to the Gaza Health Ministry, more than 33,000 Palestinians, the majority of them women and children, have been killed.

The AI tool’s name? “Lavender.”

This week, Israeli journalist and filmmaker Yuval Abraham published a lengthy exposé on the existence of the Lavender program and its implementation in the Israeli campaign in Gaza that followed the deadly October 7 Hamas terrorist attack on southern Israel. Abraham’s reporting — which appeared in the left-leaning Israeli English-language website +972 Magazine and its sister Hebrew-language publication Local Call — drew on the testimony of six anonymous Israeli intelligence officers, all of whom served during the war and had “first-hand involvement” with the use of AI to select targets for elimination. According to Abraham, Lavender identified some 37,000 Palestinians — and their homes — as targets for assassination. (The IDF denied to the reporter that such a “kill list” existed, describing the program merely as a database intended to cross-reference intelligence sources.) White House national security spokesman John Kirby told CNN on Thursday that the United States was looking into the media reports on the apparent AI tool.

“During the early stages of the war, the military gave officers wide discretion to adopt Lavender’s kill lists, with no need to explain why the machine made that choice or to check the raw intelligence data on which it was based.”

“Human personnel often served only as a ‘rubber stamp’ for the machine’s decisions,” one source said, adding that, typically, they would personally devote only about “20 seconds” to each target before authorizing a strike — “just to make sure the Lavender-marked target is male.” This was despite knowing that the system makes “errors” in about 10 percent of cases, and has been known to occasionally flag individuals with only loose links to militant groups, or no links at all.

This may help explain the scale of destruction Israel has unleashed in Gaza as it seeks to punish Hamas, as well as the high death toll. Earlier rounds of Israel-Hamas conflict saw the Israel Defense Forces go through a more protracted, human-driven process of selecting targets based on intelligence and other data. At a moment of profound Israeli rage and trauma following the October 7 attack by Hamas, Lavender could have helped Israeli commanders come up with a rapid, sweeping program of retaliation.

“We were constantly being pressured: ‘Get us more targets.’ They actually yelled at us,” said one intelligence officer, in testimony published by Britain’s Guardian newspaper, which first accessed the accounts surfaced by +972.

The munitions that Israel allegedly dropped on targets selected by Lavender were “dumb” bombs — heavy, unguided weapons that caused mass damage and civilian casualties. According to Abraham’s reporting, Israeli officials did not want to “waste” more expensive precision-guided munitions on the many junior-level Hamas “operatives” identified by the program. And they showed little qualm about dropping those bombs on buildings where the targets’ families slept.

“We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity,” A., an intelligence officer, told +972 and Local Call. “On the contrary, the IDF bombed them in their homes without hesitation. It is very easy to bomb a family’s home. The system is built to look for them in these situations.”

Throughout the war, there have been widespread concerns about Israel’s targeting strategies and methods. “It is difficult to distinguish between legitimate military targets and civilians under the best of circumstances,” Brian Castner, senior crisis adviser and weapons investigator at Amnesty International, told my colleagues in December. “And so under the basic principles of distinction, the Israeli military must use the most accurate weapon it has available and the weapon most appropriate for the target.”

In response to the revelations about Lavender, the IDF said in a statement that some of Abraham’s reporting was “baseless” and disputed the characterization of the AI program. The program “is not a system, but simply a database whose purpose is to cross-reference intelligence sources, in order to produce up-to-date layers of information on the military operatives of terrorist organizations,” the IDF wrote in a response published by the Guardian.

“The IDF does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist,” it added. “Information systems are merely tools for analysts in the target identification process.”

Outrage over this week’s Israeli drone strike on a convoy of vehicles belonging to the World Central Kitchen, a prominent food aid group, which killed seven of its workers, has cast a harsh new spotlight on Israel’s conduct of the war. In a phone call with Israeli Prime Minister Benjamin Netanyahu on Thursday, President Biden reportedly called on Israel to change course and take demonstrable steps to better protect civilian lives and enable the flow of aid.

Separately, hundreds of prominent British lawyers and judges submitted a letter to their government, urging it to suspend arms sales to Israel to avoid “engaging in serious violations of international law.”

The use of AI technology is still only a small part of what has troubled human rights activists about Israel’s conduct in Gaza. But it points to a bleak future. Lavender, observed Adil Haque, an expert in international law at Rutgers University, is “the nightmare of every international humanitarian lawyer come to life.”
