The War to End All Wars: Israel's False Flag - Part II
"Like, this AI-based warfare allows people to escape accountability. It allows to generate targets, really, on a massive — you know, thousands...marked for potential assassination." Yuval Abraham
In an age when nations and individuals routinely exchange murder for murder, when the healing grace of authentic spirituality is usurped by the divisive politics of religious organizations, and when broken hearts bleed pain in darkness without the relief of compassion, the voice of an exceptional poet producing exceptional work is not something the world can afford to dismiss.
Aberjhani
(Pexels - Markus Winkler)
In Part I, I mentioned a Reuters report detailing Israel's plan to establish a 20-30 foot buffer zone inside Gaza, running the entire length of the territory, after the war's conclusion.
Ophir Falk, a foreign policy adviser to Prime Minister Netanyahu, was quoted as saying: "The plan is more detailed than that. It's based on a three-tier process for the day after Hamas." The article continued: "Outlining the Israeli government's position, he said the three tiers involved destroying Hamas, demilitarizing Gaza and de-radicalizing the enclave. 'A buffer zone may be part of the de-militarisation process,' he said." Here's the article.
Given Gaza's history, we naturally wonder just how the destruction and demilitarization of Hamas will be accomplished. Evidently the vaunted technological superiority that failed Israel on Oct. 7 is alive and kicking now.
Two Israeli publications, +972 and Local Call, have published articles by investigative journalist Yuval Abraham, who developed confidential military informants for his investigation. He learned about two AI programs that have driven Israel's bombing in Gaza.
The piece I'm quoting from is an interview Abraham gave to Amy Goodman of Democracy Now! He began:
Now, what sources told me is that after October 7th, the military basically made a decision that all of these tens of thousands of people are now people that could potentially be bombed inside their houses, meaning not only killing them but everybody who’s in the building — the children, the families. And they understood that in order to try to attempt to do that, they are going to have to rely on this AI machine called Lavender with very minimal human supervision.
Israel turned to an AI program called Lavender.
Now, what Lavender does is it scans information on probably 90% of the population of Gaza. So we’re talking about, you know, more than a million people. And it gives each individual a rating between one to 100, a rating that is an expression of the likelihood that the machine thinks, based on a list of small features — and we can get to that later — that that individual is a member of the Hamas or Islamic Jihad military wings. Sources told me that the military knew, because they checked — they took a random sampling and checked one by one — the military knew that approximately 10% of the people that the machine was marking to be killed were not Hamas militants.
Whoops! 10% were not Hamas! “Houston, we have a problem.”
They were not — some of them had a loose connection to Hamas. Others had completely no connection to Hamas. I mean, one source said how the machine would bring people who had the exact same name and nickname as a Hamas operative, or people who had similar communication profiles. Like, these could be civil defense workers, police officers in Gaza. And they implemented, again, minimal supervision on the machine. One source said that he spent 20 seconds per target before authorizing the bombing of the alleged low-ranking Hamas militant — often it also could have been a civilian — killing those people inside their houses.
And I think this, the reliance on artificial intelligence here to mark those targets, and basically the deadly way in which the officers spoke about how they were using the machine, could very well be part of the reason why in the first, you know, six weeks after October 7th, like one of the main characteristics of the policies that were in place were entire Palestinian families being wiped out inside their houses. I mean, if you look at U.N. statistics, more than 50% of the casualties, more than 6,000 people at that time, came from a smaller group of families. It’s an expression of, you know, the family unit being destroyed. And I think that machine and the way it was used led to that.
The interview becomes even more interesting at this point. Not only did Lavender mark these individuals as Hamas; the military also set in advance how many other people could be killed along with each target in the same house or area. Low-level operatives could have 15-20 other people killed with them. For a high-level Hamas commander, the number could run into the triple digits. Such numbers dictated which munitions would be used. Low-level targets warranted dumb bombs that kill everyone in the structure. In other words, cheap bombs. You want the cost of the killing to be proportionate to, or lower than, the rank of your intended target. Executing a high-level commander, like those killed in the strike on Iran's embassy compound in Damascus, calls for sophisticated smart bombs.
So, for example, Ayman Nofal, who was the Hamas commander of the Central Brigade, a source that took part in the strike against that person said that the military authorized to kill alongside that person 300 Palestinian civilians. And we’ve spoken at +972 and Local Call with Palestinians who were witnesses of that strike, and they speak about, you know, four quite large residential buildings being bombed on that day, you know, entire apartments filled with families being bombed and killed. And that source told me that this is not, you know, some mistake, like the amount of civilians, of this 300 civilians, it was known beforehand to the Israeli military. And sources described that to me, and they said that — I mean, one source said that during those weeks at the beginning, effectively, the principle of proportionality, as they call it under international law, quote, “did not exist.”
So, if you were in the same building where a high-level Hamas leader lived or was visiting at the time, you were killed along with 299 other people.
(Pexels - Ahmed Akacha)
Wait!! We haven't gotten to "Where's Daddy?" yet. So…mass surveillance systems have a capability called "linking." They link huge volumes of material, including where someone lives.
And what sources told me is that since everybody in Gaza has a home, has a house — or at least that was the case in the past — the system was designed to be able to automatically link between individuals and houses. And in the majority of cases, these households that are linked to the individuals that Lavender is marking as low-ranking militants are not places where there is active military action taking place, according to sources. Yet the way the system was designed, and programs like Where’s Daddy?, which were designed to search for these low-ranking militants when they enter houses — specifically, it sends an alert to the intelligence officers when these AI-marked suspects enter their houses.
Remember the three cars carrying seven aid workers from chef José Andrés's World Central Kitchen, targeted by three missiles? All were killed. Israel acknowledged wrongdoing. John Kirby acknowledged wrongdoing. Since each car received one missile, I guess IDF intelligence independently confirmed they were high-level Hamas.
Whoops! Maybe “Where’s Daddy?” got the wrong car ID.
Remember when the IDF fired on Gazans rushing to get food supplies?
Witnesses and medics said Israeli forces opened fire Thursday on thousands of Palestinians who had gathered in an open area of Gaza City hoping to receive food and other desperately needed humanitarian aid. (CBS News)
Oy, vey. If only they had consulted Lavender and Where’s Daddy?
(Pixabay - nordstreamart)
Here is the final paragraph of Yuval's that I'll quote:
I mean, take the principle of distinction under international law. When you design a system that marks 37,000 people, and you check, and you know that 10% of them are actually not militants — right? — they’re loosely related to Hamas or they’re not related at all — and you still authorize to use that system without any meaningful supervision for weeks, I mean, isn’t that a breach of that principle? When you authorize to kill, you know, up to 15 or up to 20 civilians for targets that you consider, from a military point of view, not especially important, isn’t that a clear breach of the principle of proportionality? You know, and I don’t know, like, I think international law really is in a crisis right now. And I think these AI-based systems are making that crisis even worse. They are draining all of these terms from meaning.
“They are draining all of these terms from meaning.”
Here's one meaning to consider. When one force targets another via AI-connected networks, does the human simply sanction the wholesale slaughter of human life at little cost to himself? Death has no meaning. Life has no meaning.
Where’s the human and what has he become?
Where are the families whose DNA is extinguished from the surface of this world?
Yuval Abraham said above about the initial bombing in Gaza: “I mean, if you look at U.N. statistics, more than 50% of the casualties, more than 6,000 people at that time, came from a smaller group of families.”
Never forget. After each of these wars, Israel sells the weapons, weapon systems, and latest technological advances to countries throughout the world. It tells them the systems have been tested in real combat.
You know how it goes.
Now the rovin' gambler he was very bored
He was tryin' to create a next world war
He found a promoter who nearly fell off the floor
He said I never engaged in this kind of thing before
But yes I think it can be very easily done
We'll just put some bleachers out in the sun
And have it on Highway 61
Bob Dylan - “Highway 61 Revisited”