As civilian casualties continue to mount in the war-torn Gaza Strip, reports of Israel’s use of artificial intelligence (AI) in its targeting of Hamas militants are facing increasing scrutiny. A report by the Israeli outlets +972 Magazine and Local Call earlier this month said that Israeli forces had relied heavily on two AI tools so far in the conflict: “Lavender” and “Where’s Daddy?”

While “Lavender” identifies suspected Hamas and Palestinian Islamic Jihad (PIJ) militants and their homes, “Where’s Daddy?” tracks those targets and alerts Israeli forces when they return home, according to the report, which cites six Israeli intelligence officers who had used AI systems, including “Where’s Daddy?”, in operations in Gaza.

“We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity,” one of the officers told +972 and Local Call. “On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations,” they added.

    • Revan343@lemmy.ca · 7 months ago

      Israel shot their own people on October 7th, they don’t care how many Israelis die, if it helps them wipe out Palestinians

  • mlg@lemmy.world · 7 months ago

    creates AI capable of tracking targets in real time

    uses it to only attack when targets are at home with family because screw humanity I guess

    • athos77@kbin.social · 7 months ago

      See, if you blow up the entire family, then you don’t have to worry about the kids growing up to seek revenge - taps temple. /S

  • kromem@lemmy.world · 7 months ago

    What kind of Bond villain keeps naming their AI systems?

    First it was “the Gospel” and now “Where’s Daddy?”

    Did they hire Nathan for You as a consultant or something?

  • AutoTL;DR@lemmings.world (bot) · 7 months ago

    This is the best summary I could come up with:


    As civilian casualties continue to mount in the wartorn Gaza Strip, reports of Israel’s use of artificial intelligence (AI) in its targeting of Hamas militants are facing increasing scrutiny.

    “So target verification and other precautionary obligations required under international law are much harder to fulfill, implying more civilians will be misidentified and mistakenly killed,” she continued.

    President Joe Biden, despite having continued to send arms to Israel, warned the country at the time that it might lose international support due to its “indiscriminate bombing” of the Gaza Strip.

    “Every person who wore a Hamas uniform in the past year or two could be bombed with 20 [civilians killed as] collateral damage, even without special permission,” they added.

    “Analysts must conduct independent examinations, in which they verify that the identified targets meet the relevant definitions in accordance with international law and additional restrictions stipulated in the IDF directives,” they added.

    Sarah Yager, the Washington Director at Human Rights Watch, told Politico that in terms of proportionality and Israel’s use of technology, “we just have no idea.”


    The original article contains 919 words, the summary contains 175 words. Saved 81%. I’m a bot and I’m open source!