During armed conflict, unequal power relations and structural disadvantages derived from gender dynamics are exacerbated. Recognition of these dynamics has grown over the last several decades, particularly in the context of sexual and gender-based violence in conflict, as exemplified by United Nations Security Council Resolution 1325 on Women, Peace, and Security. Though initiatives like this resolution are a positive step toward recognizing the discrimination and structural disadvantages women suffer during armed conflict, other aspects of armed conflict, including, notably, the use of artificial intelligence (AI) for targeting purposes, have remained resistant to gender-based insights. This is particularly problematic in the operational aspect of international humanitarian law (IHL), which contains the rules on targeting in armed conflict.

The Gender Dimensions of Distinction and Proportionality

Some gendered dimensions of the application of IHL have long been recognized, especially in the context of rape and other forms of sexual violence against women during armed conflict. Accordingly, a great deal of attention has been paid to ensuring accountability for crimes of sexual violence in armed conflict, while other dimensions of conflict, notably the operational aspect of IHL, have remained overlooked.

In applying the principle of distinction, which requires distinguishing civilians from combatants (only the latter of whom may be the target of a lawful attack), gendered assumptions about who poses a threat have often played an important role. In modern warfare, which is often characterized by asymmetry and urban conflict and in which combatants can blend into the civilian population, some militaries and armed groups have struggled to reliably distinguish civilians from combatants. Owing to gendered stereotypes about how women and men are expected to behave, gender has operated as a de facto “qualified identity that supplements the category of civilian.” In practice, this can mean that IHL requirements are rigorously applied before women are targeted, while for young civilian males the bar appears lower – gender considerations, coupled with other factors such as geographical location, expose them to a greater risk of being targeted.

An illustrative example of this application of the principle of distinction is so-called “signature strikes,” a subset of drone strikes adopted by the United States outside what it considers to be “areas of active hostilities.” Signature strikes target persons who are not on traditional battlefields, not by individually identifying them, but based only on patterns of life. According to reports on these strikes, it is sufficient that the persons targeted “fit into the category ‘military-aged males’, who live in regions where terrorists operate, and ‘whose behavior is assessed to be similar enough to those of terrorists to mark them for death.’” However, as the organization Article 36 notes, the lack of transparency around the use of armed drones in signature strikes makes it difficult to determine in any detail what standards the U.S. government uses to classify individuals as lawful targets. According to a New York Times report from May 2012, in counting casualties from armed drone strikes, the U.S. government reportedly recorded “all military-age males in a strike zone as combatants […] unless there is explicit intelligence posthumously proving them innocent.”

However, once a target is assessed as a valid “military objective,” the impact of gender is reversed in the proportionality assessment. The principle of proportionality requires that the anticipated harm to civilians and civilian objects not be “excessive” compared to the anticipated military advantage of an attack. But in weighing anticipated advantage against anticipated civilian harm, the calculated “military advantage” can include the expected reduction in the commander’s own combatant casualties – in other words, the actual loss of civilian lives can be “offset” by the avoidance of prospective military casualties. This creates the de facto result that “the lives of combatants, the vast majority of whom are men,” are weighed as more important “than those of civilians” – who, in a battlefield context, are often disproportionately women. Taking these applications of IHL into account, we can conclude that the operational aspect of this branch of law has a gendered dimension.
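To see why this offsetting matters for any attempt to encode the rule, consider a deliberately simplified sketch. Every function name, weight, and threshold below is a hypothetical illustration – IHL itself prescribes no formula – but the structure shows how counting force protection as advantage mechanically raises the amount of civilian harm a system would deem acceptable.

```python
# Hypothetical sketch of a proportionality check in which avoided
# friendly combatant casualties count toward "military advantage."
# All names, weights, and thresholds are illustrative assumptions,
# not any military's actual methodology.

def military_advantage(tactical_value: float,
                       avoided_own_casualties: int,
                       casualty_weight: float = 1.0) -> float:
    """Anticipated advantage, inflated by expected force protection."""
    return tactical_value + casualty_weight * avoided_own_casualties

def attack_permitted(expected_civilian_harm: float,
                     advantage: float,
                     excessiveness_ratio: float = 1.0) -> bool:
    """Crude 'not excessive' test: harm may not exceed ratio * advantage."""
    return expected_civilian_harm <= excessiveness_ratio * advantage

# The same expected civilian harm (in a battlefield context, often
# disproportionately women) flips from "excessive" to "not excessive"
# once avoided combatant casualties (mostly men) are added in.
harm = 8.0
print(attack_permitted(harm, military_advantage(5.0, avoided_own_casualties=0)))  # False
print(attack_permitted(harm, military_advantage(5.0, avoided_own_casualties=4)))  # True
```

Whatever weight a designer assigns to avoided friendly casualties silently determines how much civilian harm the system treats as lawful.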

AI Application of IHL Principles

New technologies, particularly AI, are increasingly deployed to assist commanders in their targeting decisions. Specifically, machine-learning algorithms are used to process massive amounts of data to identify rules or patterns, drawing conclusions about individual pieces of information based on those patterns. In warfare, AI already supports targeting decisions in various forms. For instance, AI algorithms can estimate collateral damage, helping commanders undertake the proportionality analysis. Likewise, some drones have been outfitted with AI to conduct image recognition and are currently being trained to scan urban environments to find hidden attackers – in other words, to distinguish between civilians and combatants as the principle of distinction requires.
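As a minimal sketch of what “distinction by classifier” could look like – assuming a hypothetical vision model that outputs a probability, not any real system’s API – the snippet below reduces the legal category to a score and a threshold:

```python
from dataclasses import dataclass

# Illustrative sketch only: distinction recast as a statistical decision.
# The score and threshold are invented; no real system is depicted.

@dataclass
class Detection:
    person_id: str
    p_combatant: float  # a hypothetical model's estimated probability

THRESHOLD = 0.9  # assumed decision threshold

def classify(det: Detection) -> str:
    # Below the threshold the person must be presumed civilian: under
    # Article 50(1) of Additional Protocol I, doubt is resolved in
    # favor of civilian status.
    return "combatant" if det.p_combatant >= THRESHOLD else "civilian"

print(classify(Detection("A", 0.93)))  # combatant
print(classify(Detection("B", 0.55)))  # civilian
```

Everything that matters legally is hidden inside that probability – and, as the next sections argue, the score is exactly where gendered data and assumptions enter.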

Indeed, the use of AI in modern warfare is expanding. For example, in March 2021 the National Security Commission on AI, a U.S. congressionally mandated commission, released a report highlighting how, in the future, AI-enabled technologies will permeate every facet of warfighting. It also urged the Department of Defense to integrate AI into critical functions and existing systems in order to become an AI-ready force by 2025. As Neil Davison and Jonathan Horowitz note, as the use of AI grows, it is crucial to ensure that its development and deployment (especially when coupled with the use of autonomous weapons) comply with the protection of civilians.

Yet even if IHL principles can be translated faithfully into the programming of AI-assisted military technologies (a big and doubtful “if”), such translation will reproduce, and may even magnify, the disparate, gendered impacts of IHL application identified above. As the case of drones used for signature strikes demonstrates, integrating new technologies into warfare risks importing – and, in the case of AI, potentially magnifying and cementing – the gendered injustices already embodied in the application of existing law.

Gendering Artificial Intelligence-Assisted Warfare

There are several reasons AI may end up reifying and magnifying gender inequities. First, algorithms are only as good as their inputs – and those underlying data are problematic. To work properly, AI needs massive amounts of data. However, neither the collection nor the selection of these data is neutral. In less deadly application domains, such as mortgage lending decisions or predictive policing, there are demonstrated instances of gender (and other) biases among both programmers and the individuals tasked with classifying data samples, and even within the data sets themselves (which often contain more data on white, male subjects).

Perhaps even more difficult to identify and correct than individuals’ biases are instances of machine learning that replicate and reinforce historical patterns of injustice merely because those patterns appear, to the AI, to provide useful information rather than “undesirable noise.” As Noel Sharkey notes, “the societal push towards greater fairness and justice is being held back by historical values about poverty, gender and ethnicity that are ossified in big data. There is no reason to believe that bias in targeting data would be any different or any easier to find.”

This means that historical human biases can and do lead to incomplete or unrepresentative training data. For example, a predictive algorithm used to apply the principle of distinction on the basis of “target profiles,” together with other intelligence, surveillance, and reconnaissance tools, will be gender biased if the input data equate military-aged men with combatants and disregard other factors. As the practice of signature drone strikes has demonstrated, automatically classifying men as combatants and women as vulnerable has led to targeting mistakes. As the use of machine learning in targeting expands, these biases will be amplified if not corrected for – with each strike feeding increasingly biased data back into the system.
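A toy simulation can make this feedback loop concrete. Everything here is invented for illustration – the population, the scoring rule, the update step – and the true combatant rate is deliberately identical for men and women, so any gap the model learns is pure bias. The loop mimics the reporting above: strikes go to the highest-scoring profiles, the dead are recorded as combatants by default, and the model is retrained on those records.

```python
import random

random.seed(0)

# Toy feedback loop: biased scoring -> biased strikes -> biased labels
# -> more biased scoring. All numbers are invented for illustration.

def make_population(n=10_000, combatant_rate=0.05):
    # The true combatant rate is identical across genders, so any
    # learned gender gap is an artifact of the loop, not ground truth.
    return [{"male": random.random() < 0.5,
             "combatant": random.random() < combatant_rate}
            for _ in range(n)]

weight_male = 0.10  # initial mild "military-aged male" bias in the model

for round_ in range(5):
    population = make_population()
    # Score each profile: noisy evidence plus the learned gender weight.
    scored = sorted(population,
                    key=lambda p: weight_male * p["male"] + random.gauss(0, 0.1),
                    reverse=True)
    struck = scored[:500]  # strikes target the highest-scoring profiles
    # Casualty recording: everyone struck is labeled a combatant
    # "unless explicit intelligence posthumously proves them innocent."
    male_share = sum(p["male"] for p in struck) / len(struck)
    # Retraining nudges the gender weight toward the biased labels.
    weight_male = 0.5 * weight_male + 0.5 * (male_share - 0.5)
    print(f"round {round_}: male share of recorded 'combatants' = "
          f"{male_share:.2f}, learned male weight = {weight_male:.2f}")
```

Each round, the male share of recorded “combatants” rises and the learned weight hardens; the ground truth never changes, only the records do.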

To mitigate this result, it is critical to ensure that the data collected are diverse, accurate, and disaggregated, and that algorithm designers reflect on how the principles of distinction and proportionality can be applied in gender-biased ways. High-quality data collection means, among other things, ensuring that the data are disaggregated by gender – otherwise it will be impossible to learn what biases operate behind the assumptions used, what works to counter those biases, and what does not.
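As a sketch of what gender-disaggregated evaluation could mean in practice – with invented audit records, since no real targeting data are public – the snippet below computes a classifier’s false-positive rate (civilians misclassified as combatants) separately by gender. An aggregate error rate would average away exactly the gap the disaggregated view exposes.

```python
from collections import defaultdict

# Invented audit records for illustration: (gender, predicted, actual)
# outcomes of a hypothetical distinction classifier.
records = [
    ("male",   "combatant", "civilian"),
    ("male",   "combatant", "combatant"),
    ("male",   "combatant", "civilian"),
    ("male",   "civilian",  "civilian"),
    ("female", "civilian",  "civilian"),
    ("female", "combatant", "combatant"),
]

def false_positive_rate_by_gender(records):
    """Share of actual civilians wrongly marked combatant, per gender."""
    false_pos = defaultdict(int)
    civilians = defaultdict(int)
    for gender, predicted, actual in records:
        if actual == "civilian":
            civilians[gender] += 1
            if predicted == "combatant":
                false_pos[gender] += 1
    return {g: false_pos[g] / civilians[g] for g in civilians}

print(false_positive_rate_by_gender(records))
# {'male': 0.67 (2 of 3), 'female': 0.0} -- the aggregate rate (0.5)
# hides that, in this toy sample, only civilian men are misclassified.
```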

Ensuring high-quality data also requires collecting more and different types of data, including data on women. In addition, because AI tools reflect the biases of those who build them, ensuring that women hold technical roles – and that male employees are fully trained to recognize gender and other biases – is also crucial to mitigating data biases. Incorporating gender advisors would be a further positive step toward ensuring that the design of an algorithm, and the interpretation of what it recommends, take gender biases and dynamics into account.

However, issues of data quality are subsidiary to larger questions about whether IHL can be translated into code at all – and, even if it can, the further difficulty of incorporating gender considerations into that code. Encoding gender considerations into AI is challenging, to say the least, because gender is both a societal and an individual construction. Likewise, the process of developing AI is not neutral: it has politics and ethics embedded within it, as documented incidents of AI encoding biases demonstrate. Finally, the very rules and principles of modern IHL were drafted when structural discrimination against women went unacknowledged or was viewed as “natural” or even beneficial. As a result, when considering how to “translate” IHL into code, it is essential to bring critical gender perspectives into the interpretation of the norms and laws governing armed conflict.

Gendering IHL: An Early Attempt and Work to be Done

An example of the kind of critical engagement with IHL that will be required is provided by the updated International Committee of the Red Cross (ICRC) Commentary on the Third Geneva Convention. By incorporating particular considerations of gender-specific risks and needs (para. 1747), the updated commentary reconsiders outdated baseline gender assumptions, such as the idea that women have non-combatant status by default, or that women must receive special consideration because they have “less resilience, agency or capacity” (para. 1682). This shift demonstrates that it is not only desirable but also possible to include a gender perspective in the interpretation of the rules of warfare. It also underscores the urgent need to revisit the IHL targeting principles of distinction and proportionality to assess how their application impacts genders differently, so that any algorithms developed to execute IHL principles incorporate these insights from the start.

As a first cut at this reexamination, it is essential to reassert that principles of non-discrimination apply to IHL as well, and must be incorporated into any algorithmic version of its rules. In particular, the principle of distinction permits commanders to lawfully target only those identified as combatants or those directly participating in hostilities. Article 50 of Additional Protocol I to the Geneva Conventions defines civilians negatively: civilians are all those who do not belong to the category of combatants – and IHL makes no reference to gender as a signifier of identity for the purpose of assessing whether a given individual is a combatant. Being a “military-aged male” therefore cannot serve as a shortcut to identifying combatants; men, too, make up the category of civilians. As Maya Brehm notes, “there is scope for ‘categorical targeting’ within a conduct of hostilities framework, but the principle of non-discrimination continues to apply in armed conflict. Adverse distinction based on race, sex, religion, national origin or similar criteria is prohibited.”
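If distinction rules were ever encoded, the non-discrimination principle implies, at a minimum, that sex must not function as a targeting feature. The sketch below (all feature names are hypothetical) enforces that constraint by construction, while the comment flags the harder problem that correlated proxies can smuggle the prohibited distinction back in:

```python
# Hypothetical sketch: a non-discrimination constraint that refuses to
# let protected attributes enter a targeting model's feature vector.
# Feature names are invented for illustration.

PROTECTED = {"sex", "gender", "race", "religion", "national_origin"}

def build_features(profile: dict) -> dict:
    """Return only non-protected attributes for use as model features."""
    features = {k: v for k, v in profile.items() if k not in PROTECTED}
    # Caveat: dropping the attribute is necessary but not sufficient.
    # Proxies such as "military_aged" combined with location can
    # reconstruct the prohibited distinction, which is why the kind of
    # disaggregated audit sketched earlier remains essential.
    return features

profile = {"sex": "male", "military_aged": True, "region": "contested"}
print(build_features(profile))  # {'military_aged': True, 'region': 'contested'}
```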

Likewise, any attempt to translate the principle of proportionality into code must recognize and correct for the gendered impacts of current proportionality calculations. For example, across Syria between 2011 and 2016, 75 percent of the civilian women killed in conflict-related violence were killed by shelling or aerial bombardment. By contrast, 49 percent of civilian men killed in war-related violence were killed by shelling or aerial bombardment; men were more often killed by shooting. This suggests that particular tactics and weapons have disparate impacts on civilian populations that break down along gendered lines. The study’s authors note that the evolving tactics used by Syrian, opposition, and international forces contributed to a decrease in the proportion of casualties who were combatants, as the use of shelling and bombardment – two methods shown to have high rates of civilian casualties, especially among women and children – increased over time. The authors also note, however, that changing patterns of civilian and combatant behavior may partially explain the rising proportion of women relative to men among civilian casualties: “A possible contributor to increasing proportions of women and children among civilian deaths could be that numbers of civilian men in the population decreased over time as some took up arms to become combatants.”

As currently understood, IHL does not require an analysis of the gendered impacts of, for example, choosing aerial bombardment over shooting. Yet this research suggests that selecting aerial bombardment as a tactic will result in more civilian women than men being killed (nearly 37 percent of the women killed in the conflict versus 23 percent of the men). Selecting shooting as a tactic produces the opposite result, with 23 percent of civilian men killed by shooting compared to 13 percent of women. There is no “right” proportion of civilian men and women killed by a given tactic, but these disparities have profound, real-world consequences for civilian populations during and after conflict – consequences that are simply not considered under current rules of proportionality and distinction.
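Using only the percentages reported in the study cited above, a few lines of arithmetic make the tactic-level disparity explicit. The “share ratio” framing is an illustration, not the study’s own:

```python
# Share of civilian deaths of each gender attributable to a given
# tactic in Syria, 2011-2016, as reported in the study cited above.
share_of_deaths = {
    "shelling or aerial bombardment": {"women": 0.75, "men": 0.49},
    "aerial bombardment alone":       {"women": 0.37, "men": 0.23},
    "shooting":                       {"women": 0.13, "men": 0.23},
}

for tactic, shares in share_of_deaths.items():
    ratio = shares["women"] / shares["men"]
    skew = "women" if ratio > 1 else "men"
    print(f"{tactic}: women/men ratio = {ratio:.2f} (skews toward {skew})")
```

The choice of tactic is, in effect, a choice about which civilians bear the risk – precisely the kind of disaggregated consequence that current proportionality reviews do not require commanders to weigh.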

In this regard, although using force protection to limit one’s own forces’ casualties is not forbidden, such a strategy ought to account for the effect it will have on the civilian population of the opposing side – including gendered impacts. Compiling data on how a given means or method of warfare affects the civilian population would enable commanders to make more informed decisions. Acknowledging that the effects of weapons in warfare are gendered is the first key step. In some cases, there has been progress in incorporating a gendered lens into positive IHL, as with cluster munitions: Article 5 of the convention banning these weapons provides that States shall furnish gender-sensitive assistance to victims. But most of this analysis remains rudimentary and is not clearly required. In developing AI-assisted technologies, reflecting on the gendered impact of an algorithm is essential – during development, acquisition, and application alike.

The process of encoding the IHL principles of distinction and proportionality into AI systems provides a useful opportunity to revisit the application of these principles with modern gender perspectives in view – both in how the principles are interpreted and in how their application impacts men and women differently. As the recent update of the ICRC Commentary on the Third Geneva Convention illustrates, acknowledging and incorporating gender-specific needs in the interpretation and suggested application of the existing rules of warfare is not only possible, but also desirable.

Disclaimer: This post has been prepared as part of a research internship at the Erasmus University Rotterdam, funded by the European Union (EU) Non-Proliferation and Disarmament Consortium as part of a larger EU educational initiative aimed at building capacity in the next generation of scholars and practitioners in non-proliferation policy and programming. The views expressed in this post are those of the author and do not necessarily reflect those of the Erasmus University Rotterdam, the EU Non-Proliferation and Disarmament Consortium or other members of the network.

Image: SANA’A, YEMEN – SEPTEMBER 19: A Yemeni man looks at graffiti protesting against US drone strikes on September 19, 2018 in Sana’a, Yemen. (Photo by Mohammed Hamoud/Getty Images)