A war of collateral victims. How AI guided Israel's bombings in Gaza

Marking the second anniversary of the invasion of Ukraine launched by Moscow on February 24, 2022, the UN Human Rights Monitoring Mission in Ukraine (HRMMU) released a report with statistics on the so-called “human cost” of the war: 30,457 civilian casualties, of which 10,582 were killed and 19,875 injured. Among the civilian casualties of the conflict were 587 children killed and 1,298 injured.



Figures cited in mid-March by Philippe Lazzarini, head of UNRWA (the United Nations Relief and Works Agency for Palestine Refugees in the Near East), indicated that 12,300 children had been killed in Gaza in just five months. More children have died under Israel's siege in those few months than in all the world's wars combined over the previous four years.



The HRMMU document notes that the number of civilian casualties in Ukraine “was high in the first months after the large-scale armed attack began, with thousands of civilians killed and wounded per month”, decreasing through 2022 and 2023. In 2023 the monthly average was 163 civilians killed and 547 injured, a toll comparable to a single night of bombing of the occupied Palestinian territory by Israeli forces.



Given these numbers, the question remains: a lack of control by the IDF (Israel Defense Forces), or intent? The answer may lie partly in artificial intelligence (AI), namely Lavender, the system used by the Israelis to plan attacks against Hamas members. Broadly, the system analyzes the available data and calculates the likelihood that an individual is a militant, with the aim of eliminating him, while assuming an acceptable number of civilian casualties. It is this fluctuating equation that gives, or withholds, the green light for an attack, with a pre-estimated number of collateral victims set by decisions from higher up the chain of command.



An investigation by the Israeli journalist Yuval Abraham, published by the magazine +972 and by Local Call, an independent news website based in Tel Aviv, and to which the Guardian had access, collected the testimony of six Israeli military intelligence officers involved in identifying Hamas and Islamic Jihad targets in the offensive against the Gaza Strip using artificial intelligence (AI) technology.




According to these officers, AI resources, especially the Lavender system, played an important role in the course of the IDF offensive, showing a remarkable ability to quickly identify potential fighters in the territory and turn them into targets to be struck. Lavender reportedly flagged as many as 37,000 Palestinians affiliated with Hamas or Islamic Jihad as potential targets.



This is not an entirely new modus operandi: Israeli forces had acknowledged using AI as a weapon of war by 2021. That year's 11-day war in the Gaza Strip was described at the time as the world's first “AI war”. Already at stake were the reduction of decision-making time and the small number of people needed to carry out combat tasks remotely: “What used to take hours now takes minutes, with only a few more minutes for human analysis”, explained an officer who headed Tsahal's digital transformation division.



Several officers pointed out that the system had been designed for a full-scale war, three years before the current conflict in the Gaza Strip.

We believe in algorithmic coldness

According to the officers interviewed by Yuval Abraham, there was a kind of blanket green light to kill in the first weeks, and even the first months, of the attack on the Gaza Strip, which resulted in high numbers of civilian casualties.

But by delegating the analysis to this method, Israeli forces may be trampling the moral and legal principles that international organizations have laid down to regulate armed conflict. It amounted, in a sense, to a transfer of responsibility from man to machine, which, as one of Yuval Abraham's sources put it, “made everything easier”.



“It's unparalleled, never seen before,” said one of the officers who used Lavender, explaining that the teams had more faith in a “statistical mechanism” than in a grieving soldier who had lost someone in the Hamas attack: “Everyone there, including me, lost people on October 7. The machine did it coldly. And that made it easier.”


Another source who used the AI program even questioned whether human input added anything significant to the targeting process: “I would invest 20 seconds on each target at this stage, and do dozens of them every day. I had zero value to add as a human, apart from being a stamp of approval. It saved a lot of time.”

“We believe in algorithmic coldness” could be the IDF's slogan at this moment. Lavender, according to the testimonies, assumed greater decision-making power than the commanders in charge of operations. And the “dumb bombs”, unguided munitions used to destroy the target and everything around it, did the rest, leveling entire residential blocks and killing all their residents.

One of the officers Yuval Abraham interviewed explained that when the suspect is a potential low-ranking Hamas or Jihad fighter, the preference is to strike when he is believed to be at home: “We were not interested in killing operatives only when they were in a military building or engaged in military activity. Bombing a family home is much easier. The system is built to look for them in these situations.”

Conclusion: first the machine, then the man

The initial decision is thus left in the hands of the machine: Lavender, a system created by Unit 8200, the IDF's elite intelligence unit and the equivalent of the United States' National Security Agency. The green light to launch an attack depends on pre-established criteria for an acceptable number of collateral victims, a margin which, as the numbers show, is wide.



This acceptable number of civilian casualties varied over the six months of the war. According to two of the investigation's sources, in the first weeks of the conflict attacks on low-ranking Hamas or Islamic Jihad fighters were allowed to kill 15 to 20 civilians, attacks carried out with the aforementioned “dumb bombs”. The priority at the time was to speed up the pace of the bombing, which meant having a machine capable of producing targets on a massive scale.




And Lavender responded fully to this new priority. According to the Guardian, before the war these procedures went through a bureaucratic process involving deliberation before any decision, validated by a legal adviser.




After October 7, often driven by a desire for revenge, according to one of the investigation's sources, the pattern of authorizing attacks on human targets accelerated dramatically. Commanders demanded a steady stream of potential Hamas and Jihad profiles.



“We were constantly under pressure: 'Bring us more targets.' They even shouted at us,” said one officer. “They told us: we must bring down Hamas, at any cost. Whatever you can, you bomb.”



In this new environment, forces on the ground began to rely increasingly on the Lavender system to generate targets linked to Hamas. Believing that the algorithm achieved 90 percent accuracy in the analysis of suspect profiles, the army recommended its use on the battlefield.



This option, according to one of the sources interviewed for the investigation, allowed the Israeli military to absolve themselves of guilt over a wrong decision by charging potential errors of judgment to the machine's account. A wartime justification for attacks that appear uncontrolled, even though they were directed at precise targets in a deliberate manner, and that produced such high numbers of civilian casualties.



On the identification of low-ranking targets, he explained: “When it comes to a junior fighter, we're not going to invest manpower or waste time on it. [In the middle of a war, there is no time to] carefully incriminate every target. So we're willing to accept the margin of error of using AI, to risk collateral damage and civilian deaths, to face the risk of striking by mistake, and to live with it.”
