

Rest assured, the only combat these men will see is via drone surveillance feeds from behind a desk.
I can’t help but worry, though, that beyond all the familiar concerns about surveillance and privacy, Foundry and the mega-database being compiled on all Americans could be used in some pretty horrific ways: against individuals who speak out against our new leaders, or against individuals who merely resemble someone who spoke out against our elite technologists.
How Israel Uses AI in Gaza—And What It Might Mean for the Future of Warfare
A program known as “The Gospel” generates suggestions for buildings and structures militants may be operating in. “Lavender” is programmed to identify suspected members of Hamas and other armed groups for assassination, from commanders all the way down to foot soldiers. “Where’s Daddy?” reportedly follows their movements by tracking their phones in order to target them—often to their homes, where their presence is regarded as confirmation of their identity. The air strike that follows might kill everyone in the target’s family, if not everyone in the apartment building.
Abraham, whose report relies on conversations with six Israeli intelligence officers with first-hand experience in Gaza operations after Oct. 7, quoted targeting officers as saying they found themselves deferring to the Lavender program, despite knowing that it produces incorrect targeting suggestions in roughly 10% of cases.
Yep, “no reason.” And that giant database on all Americans that Palantir and all these other tech companies “aren’t” helping build is nothing we should be concerned about.
And we’re “not going to war with Iran,” even as Trump planned the military strikes against Iran, yet somehow the U.S. wasn’t involved in any way.