Digital tools and military technology – Reflections on AI amidst genocide

In the past few years, conversations about data ethics have taken on central importance in digital humanities and geography. We've become increasingly aware of the care and caution with which academics must employ digital tools. As a geographer, I'm concerned with the ways that people, academics included, make and remake the world around us. Digital technologies allow us to deepen and expand access to knowledge production and to revolutionize our world-making practices. But the possibilities for our world-making are conditioned by the tools we use to build that world.

The uneasiness I feel about using emerging technologies in academia to expand access to knowledge production has become sickeningly acute in the past seven months, as I piece together the reality that many of these technologies were not developed to expand liberatory world-making practices. Instead, their development has been animated by interests working in concert to build the power to annihilate–in this case, to annihilate Palestine.

Digital tools similar to, if not the same as, the ones we use in our research and classrooms are creating the conditions for the ongoing genocide of Palestinians in Gaza by the Israeli military. In the past seven months alone, Israeli forces have killed over 33,000 Palestinians and displaced over 1.4 million. On average, Israel has been killing over 250 Palestinians a day, far surpassing the death rate of any other major conflict in the 21st century–compared with an average of roughly 50 Iraqi deaths per day at the hands of the US military. How is this even possible? Tech makes it happen. The collaboration between the tech industry and the Israeli military is unparalleled–Israel's tech industry is the second largest in the world, just behind Silicon Valley, but its overt R&D focus on warfare is unlike anything else. The IDF actively uses Gaza and the killing of Palestinians as a testing ground for new technologies of warfare before cybersecurity companies take their killing machines to the world market.

When Israel bombs hospital after hospital, claiming afterward that Hamas operatives were identified in the building, it is AI that makes those identifications. The military employs an AI system called Lavender to identify human targets. Lavender has generated a list of 37,000 Palestinians marked as suspected Hamas fighters, and assassination approval is effectively automatic for people on this list–no fact-checking or confirmation is made by a human being before the Palestinian suspect is targeted and killed. The machine generated this list of 37,000 by giving every single person in Gaza a rating from 1 to 100 on how likely they are to be working with Hamas. The error rate of this AI is 10%.

Alongside the lists generated by Lavender, Israel uses two more AI programs: one called "Where's Daddy?" and another called "The Gospel." "The Gospel" marks buildings and structures for attack, while "Where's Daddy?" tracks the locations of Palestinians marked for assassination and sends an alert to targeting officers when a suspect has arrived at home, so that officers can mark the home for automatic bombing. Israeli military policy allows the killing of 15-20 non-targeted Palestinians as "collateral damage," so targeting officers are free to act on intel from "Where's Daddy?" when whole families are at home. The lag time between a "Where's Daddy?" alert and the actual bombing often means that families are bombed in their homes when the suspect isn't even there. Given the lack of human oversight, the error rates, and the lag times, it's clear that the Israeli military and tech companies are not developing AI to mitigate the death and destruction wrought by warfare–they aim to make it automatic and all-encompassing. With this technology there is no precision killing; there is only mass assassination, only genocide.

Militaries both in the US and abroad fund, develop, and test digital technology through warfare, and then the tools trickle down to us. Sometimes for better, sometimes for worse, digital tools deepen and extend our connections to one another across space and time. As researchers and educators, we have to think hard about this connectedness and imagine ways to build new tools, or to reappropriate the ones we already have. I don't have a solution to the difficult feelings of complicity that all of this raises for me. But I think it's important to carry that feeling–to remain painfully aware of the horrors that make the technology we use possible.

To learn more about military technology testing in occupied Palestine, read The Palestine Laboratory: How Israel Exports the Technology of Occupation Around the World by Antony Loewenstein. 

For a history of the relationship between fascism and tech innovation, read Fascist Pigs: Technoscientific Organisms and the History of Fascism by Tiago Saraiva.
