This analysis delves into the impact of these systems on military strategies, innocent civilians and global perceptions, highlighting the intricate interplay between technology, warfare tactics and ethical considerations.
Updated: Jun 06, 2024, 05:07 PM IST
The Israeli military has integrated advanced AI technology, such as the Gospel and Lavender systems, into its operations in Gaza, revolutionising the conduct of conflict by enhancing target identification and speeding up military decision-making. These technological advancements are geared towards optimizing the efficiency and precision of military actions.
Nevertheless, the deployment of these AI systems raises significant ethical concerns about civilian safety and the degree of human oversight. The analysis below examines the impact of these systems on military strategies, civilians and global perceptions, and sheds light on the evolving dynamics of modern conflict in terms of strategies, policies and their implications for human lives.
1. Employment of Gospel for Target Identification: In Gaza, the Israeli military has employed an AI system known as Gospel to pinpoint potential targets, such as schools, medical facilities, places of worship and aid organization offices. Hamas officials have cited over 30,000 Palestinian casualties, a considerable number of them women and children. The Gospel system uses machine learning (ML) to sift through vast data sets for target identification.
The Israeli military says that, besides increasing target accuracy, the system also speeds up target selection through automation. Within the initial 27 days of the conflict, it purportedly targeted over 12,000 locations.
While it remains unclear whether funds earmarked for Israeli military technology specifically support the Gospel system, Israel has also developed AI-augmented precision assault-rifle sights, such as SMASH from Smart Shooter. These sights use advanced image-processing algorithms to detect targets in Gaza and the occupied West Bank.
2. Impact of Lavender: The Lavender AI system played a pivotal role in Israeli military operations against Palestinians during the early stages of the conflict, generating extensive ‘kill lists’ of individuals affiliated with Hamas and Palestinian Islamic Jihad (PIJ). At this juncture, military officers were authorised to act on these lists with minimal oversight, without fully scrutinising the selection rationale or the underlying intelligence data. Human supervision was often limited to a roughly 20-second check, predominantly to confirm the target's gender.
With an error rate hovering around 10%, misidentifications occurred in which individuals only marginally linked to militant groups, or unrelated to them, were mistakenly targeted. Decision-makers often accepted these determinations without further verification. The Israeli military systematically attacked targeted individuals in their residences, mainly at night when family members were likely to be present, rather than in military settings. Automated tracking systems, such as the recently disclosed ‘Where’s Daddy?’ system, carried these AI-driven initiatives forward by identifying targets, reportedly in conjunction with SpaceX and Meta.
3.1 Implications & Controversies: The integration of WhatsApp data into Lavender has stirred controversy. According to the UK-based investigative outlet Grey Dynamics, Paul Biggar, a respected software engineer and founder of Tech for Palestine, has shared insights into Lavender’s methodology. It is suggested that Lavender gathers intelligence from digital traces within WhatsApp groups to identify targets in Gaza, showcasing the significant influence of data mining and social media on contemporary military operations.
Reports have highlighted a troubling ‘pre-crime’ aspect of Lavender’s operations, in which WhatsApp associations with suspected militants become grounds for targeting individuals. Metadata extracted from WhatsApp group memberships feeds Lavender’s algorithmic decision-making.
Moreover, Meta’s reported involvement in transferring data to Lavender has stirred controversy. The close ties of Meta’s leadership, including Chief Information Security Officer Guy Rosen and CEO Mark Zuckerberg, with Israel have prompted questions about the extent of collaboration between tech giants and defence entities.
The use of Lavender raises ethical concerns because of the large number of civilians killed. Israeli officials’ acknowledgment that they target ‘suspects’ in residential homes, leading to civilian casualties, including children, underscores the moral challenges posed by AI-enabled warfare.
3.2 Discussions between Elon Musk and Israeli military representatives regarding AI underscore the critical nexus between technology and national security. Although only briefly mentioned in a government report, the talks reveal Musk’s deep engagement with Israeli officials on the security implications of AI technologies.
The meeting, attended by senior security personnel, highlights the strategic significance attached to AI advancements in safeguarding national interests. Furthermore, collaboration between SpaceX and Israel in launching the EROS C3 reconnaissance satellite signifies Musk’s broader participation in AI-driven initiatives. SpaceX reportedly incorporates AI technology to enhance flight paths and oversee satellite networks within its operations.
The EROS C3 satellite, launched by SpaceX, plays a crucial role in Israel’s intelligence infrastructure. It offers advanced Earth observation capabilities, underscoring the strategic significance of AI in modern reconnaissance and intelligence-gathering efforts.
Manufactured by Israel Aerospace Industries (IAI) and operated by ImageSat International, the EROS C3 satellite stands as a cutting-edge reconnaissance tool. Its high-resolution imaging capabilities cater to diverse government and commercial needs. Outfitted with an advanced space camera from Elbit Systems, the satellite captures detailed imagery essential for a variety of missions.
4. The adoption of a broadened targeting strategy, described below, resulted in a notable increase in casualties among Palestinians, especially women, children and non-combatants, during the initial weeks of the conflict, owing to decisions influenced by AI systems.
Following the October 7 assault by Hamas-led militants, which left around 1,200 people dead and 240 abducted in southern Israel, the Israeli military adopted a new approach.
Through ‘Operation Iron Swords’, the army broadened its selection criteria to treat all members of Hamas’s military branch as potential targets, irrespective of rank or direct participation in combat. This adjustment marked a significant escalation in the military’s response tactics.
5. This shift posed a complex hurdle for Israeli intelligence operations. Previously, authorising the elimination of a high-value target required an intricate ‘incrimination’ process: confirming the target’s seniority within Hamas, determining where they lived, collecting contact information and accurately pinpointing their real-time whereabouts.
While focusing primarily on senior figures allowed intelligence operatives to manage each target meticulously, expanding the list to encompass thousands of lower-ranking members escalated both the complexity and scale of intelligence operations.
To address this expanded scope, the Israeli military increasingly turned to automated software and AI technologies. This transition reduced human involvement in verification processes, while empowering AI to identify military operatives and make critical decisions.
6.1 After manual assessments of a random subset of several hundred targets identified by AI systems, the authorities sanctioned full implementation of Lavender’s kill lists roughly two weeks into the conflict. The findings indicated that Lavender achieved roughly 90% accuracy in confirming individuals’ ties to Hamas.
The Israeli military came to depend heavily on the Lavender system, treating its identification of individuals as Hamas militants as definitive, with no perceived need for human verification of the AI’s decisions or examination of the underlying intelligence data.
The relaxing of targeting constraints in the war’s early stages had a significant impact. As reported by the Palestinian Health Ministry in Gaza, approximately 15,000 Palestinians were killed within the first six weeks. According to Grey Dynamics, this represented nearly half of all casualties documented up to the ceasefire that took effect on November 24.
6.2 Sophisticated Tracking Systems: Drawing on extensive surveillance of about 2.3 million residents of Gaza, the Lavender system collects data to classify individuals according to their potential ties to the military factions of Hamas or PIJ, rating each person’s likely involvement in militancy on a scale of 1-100.
The Lavender system works by identifying traits common among known Hamas and PIJ members, using these as training data and applying the resulting model across the populace to detect similar characteristics. Individuals exhibiting multiple suspicious traits receive higher scores, flagging them as potential targets for elimination.
Factors that could elevate one’s score include participation in WhatsApp groups with known militants, frequent cell phone changes and regular relocations—behavior patterns hinting at possible militant connections.
6.3 Effects on Civilian Populations: As the conflict progressed, officers were directed not to independently verify the AI system’s assessments, in order to expedite target identification. Internal assessments, however, indicated that Lavender was only about 90% accurate.
Instances arose in which individuals were misidentified because their communication patterns resembled those of known militants. As a result, police personnel, civil defence workers, relatives of militants, people sharing a militant’s name and people using devices that had once belonged to militants were targeted.
7. Shift in Israel’s Military Approach: Before bombing the residences of suspected ‘junior’ militants flagged by the Lavender AI system, human operators merely verified the target’s gender, on the assumption that a female target was likely an error by the AI, as there were no women in the military branches of Hamas. This indicates that most decisions were made by the AI with minimal human oversight.
In the subsequent phase of the Israeli army’s assassination process, efforts were concentrated on locating the precise whereabouts of the targets identified by Lavender. Despite official statements, a key reason for the high death toll in the ongoing bombings is the military’s choice to strike these individuals at their residences, often alongside their families. This approach made family homes the default sites for bombing missions, resulting in increased casualties.
7.1 Impact of Targeting Practices: In contrast to situations in which Hamas combatants have operated from civilian sites, these targeted assassination operations largely focused on suspected militants in their homes, where there was no evidence of military activity. This practice underscores how Israel’s surveillance systems in Gaza make it possible to link individuals to their family dwellings.
To facilitate this process, developers created sophisticated software, including a tool named ‘Where’s Daddy?’, which tracks targets and sends an alert when they return home, enabling bombings to be precisely timed. The number of families obliterated entirely in their residences during this conflict has risen substantially compared with the 2014 conflict, indicating a significant escalation in the use of this strategy, according to reports by Grey Dynamics.
7.2 ‘Where’s Daddy?’ Monitoring System: When the pace of assassinations declined, the authorities began entering more targets into monitoring systems such as ‘Where’s Daddy?’, which tracked people as they entered their homes, leaving them sitting ducks for an airstrike. Decisions on whom to include in these monitoring systems could be made by lower-ranking officers.
Within the initial two weeks of hostilities, ‘several thousand’ targets were registered in programmes such as ‘Where’s Daddy?’. These included members of Hamas’s elite Nukhba special forces unit, anti-tank operatives and those who crossed into Israel on October 7. Over time, however, the list expanded significantly to cover a far broader spectrum of individuals.
8. Israel’s Military AI in Action: The combination of the Lavender AI system and tracking tools such as ‘Where’s Daddy?’ led to devastating results, with entire families killed after being earmarked by Israeli military AI. Once a name from Lavender’s lists was entered into the ‘Where’s Daddy?’ home-tracking system, the individual was placed under constant surveillance and became liable to be struck upon returning home, often resulting in the collapse of the entire building and the death of all its occupants.
Following identification by Lavender, officials would confirm that the target was male and utilize tracking technology to pinpoint them at their residence. The subsequent step involved selecting the appropriate bomb type for the airstrike.
Typically, junior operatives identified by Lavender were assigned ‘dumb bombs’ to economize on pricier armaments. Consequently, if a junior target resided in a high-rise structure, the military refrained from using the more precise and costly ‘floor bomb’ that would have reduced collateral damage. When the target was in a low-rise building, the army approved the use of a ‘dumb bomb’, potentially endangering all the building’s occupants.
9. Shifts in Strategy and Global Influence: In earlier conflict phases, the authorities permitted strikes on junior operatives recognised by such AI systems as Lavender even when they were expected to cause up to 15-20 civilian deaths per target.
Currently, under American influence, the Israeli military has discontinued the practice of designating numerous junior targets for bombing in civilian dwellings, even those identified by its military AI systems. This change has also reduced the army’s reliance on intelligence databases and automated tools for locating houses.
10. Civilian Fallout: Israeli Military AI Effects: Following one such assault, witnesses recounted the solemn process of recovering bodies from the rubble; approximately 50 people were killed and around 200 injured on the first day alone. Camp residents spent five days searching for survivors and recovering the dead.
In mid-December, military forces targeted a tall building in Rafah, intending to eliminate Mohammed Shabaneh, leader of Hamas’ Rafah Brigade. While the strike caused casualties among ‘numerous civilians’, it remains uncertain whether Shabaneh was among them. Senior leaders often take refuge in tunnels beneath civilian buildings, so airstrikes directed at them inevitably put civilians at risk.
11. Precision Check: Israeli Military AI in Focus: In the current conflict, the Israeli armed forces have relied on automated, and frequently imprecise, methods to estimate the potential civilian casualties linked to each target. In prior conflicts, intelligence personnel meticulously verified the occupancy of targeted residences and noted any potential civilian harm in a designated ‘target file’. From October 7 onwards, however, this meticulous verification process was predominantly supplanted by automated means.
In October, as highlighted by The New York Times, a system operated from a base in southern Israel gathered data from mobile devices in Gaza, providing real-time assessments of Palestinian movement from northern to southern Gaza. The system categorized regions by colour code: red for high-density population areas, green and yellow for less populated zones.
11.1 Tactical Implications: Impact of Israeli Military AI: A comparable system is used to assess unintended harm and to help determine which structures in Gaza to bomb. The software initially recorded the civilian population of each household before the conflict began, then adjusted for evacuations: if the military estimated that half the residents of a neighbourhood had left, a home with 10 occupants was counted as holding only five.
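To make the reported arithmetic concrete, the following is a minimal sketch in Python of the proportional adjustment described above, assuming the software simply scales each household's pre-war headcount by a neighbourhood-level evacuation estimate. The function name and structure are this writer's illustrative assumptions, not details of the actual system.

```python
# Hypothetical illustration only: scales a pre-war household headcount
# by the estimated share of the neighbourhood that has evacuated.
def estimate_current_occupants(prewar_occupants: float,
                               evacuated_fraction: float) -> float:
    """Return the adjusted occupancy estimate for one household."""
    if not 0.0 <= evacuated_fraction <= 1.0:
        raise ValueError("evacuated_fraction must be between 0 and 1")
    return prewar_occupants * (1.0 - evacuated_fraction)

# The article's worked example: with half a neighbourhood evacuated,
# a home of 10 residents is counted as holding 5.
print(estimate_current_occupants(10, 0.5))  # 5.0
```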
There were instances where a significant delay occurred between receiving alerts from tracking systems like ‘Where’s Daddy?’ indicating that a target had entered a residence and the subsequent airstrike. This gap in timing often led to entire families, who were not the intended targets, being wiped out.
11.2 Shifts in Post-Attack Procedures: In previous Gaza conflicts, Israeli intelligence typically conducted post-strike damage assessments to verify the deaths of senior commanders and to gauge civilian casualties. In the current conflict, however, particularly for junior militants identified through AI, the authorities bypassed this process to expedite their operations.
Conclusion
The integration of Israeli military AI systems into operations in Gaza represents a significant transformation in modern warfare with profound implications for civilian populations and international conflict standards. While these technologies offer enhanced targeting accuracy and efficiency, their deployment has also resulted in alarming levels of civilian casualties and raised crucial concerns regarding oversight and accountability.
As conflicts grow more complex and intense, it is imperative for policymakers, military authorities and the global community to engage in substantive dialogue and establish frameworks to mitigate the humanitarian impact of AI-driven warfare. Through the responsible and ethical use of technology, we can confront the challenges of contemporary conflict while upholding human rights and international law.
(The author of this article is a Defence, Aerospace & Political Analyst based in Bengaluru. He is also Director of ADD Engineering Components, India, Pvt. Ltd, a subsidiary of ADD Engineering GmbH, Germany. You can reach him at: girishlinganna@gmail.com)