Lavender & Where's Daddy: How Israel Used AI to Form Kill Lists & Bomb Palestinians in Their Homes

The Israeli publications +972 and Local Call have exposed how the Israeli military used an artificial intelligence program known as Lavender to develop a “kill list” in Gaza that includes as many as 37,000 Palestinians who were targeted for assassination with little human oversight. A second AI system known as “Where’s Daddy?” tracked Palestinians on the kill list and was purposely designed to help Israel target individuals when they were at home at night with their families. The targeting systems, combined with an “extremely permissive” bombing policy in the Israeli military, led to “entire Palestinian families being wiped out inside their houses,” says Yuval Abraham, an Israeli journalist who broke the story after speaking with members of the Israeli military who were “shocked by committing atrocities.” Abraham previously exposed Israel for using an AI system called “The Gospel” to intentionally destroy civilian infrastructure in Gaza, including apartment complexes, universities and banks, in an effort to exert “civil pressure” on Hamas. These artificial intelligence military systems are “a danger to humanity,” says Abraham. “AI-based warfare allows people to escape accountability.”

Democracy Now! produces a daily, global, independent news hour hosted by award-winning journalists Amy Goodman and Juan González. Our reporting includes breaking daily news headlines and in-depth interviews with people on the front lines of the world’s most pressing issues. On Democracy Now!, you’ll hear a diversity of voices speaking for themselves, providing a unique and sometimes provocative perspective on global events.