
Why the Hard Case against Machine Learning in Military Intelligence Production is Institutional


U.S. military use of machine learning in intelligence products enjoins the military, the state, and its citizens in a sprawling public-private digital ecosystem, including the digital data supply chains relied on for model training. Arguments for and against the applicability of machine learning in these products are dominated by technical, legal, ethical, and organisational issues, which often obscure a fifth obstacle that trumps them all. The obstacle is institutional. It involves the transgression of the normative element of what the military does—its primary purposeful activity—via the changes we are witnessing in how it does it. Digital supply chains that fuel model building include nominally deidentified data collected via the mass surveillance of U.S. citizens’ everyday activities by private companies. These surveillance activities establish behavioural baselines derived from neutral activities, which previously represented little or no value to security communities, but from which a variety of derivatives can now be produced. These data derivatives include behavioural anomaly detection for target identification. This shift manifests an overlooked institutional fault line in civil-military relations with profound implications. The normativity of state-sanctioned killing is at stake.

New Tech, New Enterprise, and a New Institution

In a 1999 RAND monograph on the American military enterprise in the digital information age, Carl H. Builder identifies a point of tension in civil-military relations that has deepened since that publication and remains unresolved. His central point was that the transformative nature of the digital information age means the U.S. military enterprise will not simply be applying new tools and methods to existing roles and missions. Rather, it will become a new enterprise with new roles and missions.[1] The new enterprise will enjoin a new relationship with the society from which its resources and mandate are drawn. The recent development of machine learning models by private companies, which are trained on large data sets made possible by mass surveillance of everyday civilian interactions, means observers, scholars, and practitioners can now begin to make sense of that new relationship and assess its implications for the military, society, and the state.

The Normativity of Killing

As John Keegan writes in his 2011 A History of Warfare, warrior culture follows society, but at a distance.[2] Trends in economics and society make their way in and out of military affairs, but the transfer is never total, nor are the effects uniformly distributed across sectors. But the cultural gap between the military and society is most stark when considered at an institutional level. Disrupting business models and social norms in entertainment, commerce, social life, and epistemology is categorically distinct from disrupting the business of killing. The military enterprise remains institutionally alienated from society, because its primary purposeful activity is the sanctioned killing of human beings. The modern state has traditionally been the institutional custodian of these foundational normative relations.[3]

The military’s resources and mandate are both drawn from, and negotiated with, society in formal and informal ways. In the U.S., society’s formal stake is negotiated via the legal restrictions it imposes on the military, and the political control of its roles and missions in peace and war by executive government, as well as congressional control over budgets. The normative stake is manifest informally. At an intimate level, the families, loved ones, and communities from which military personnel are drawn and put in harm’s way care about the meaning of what they do, why they do it, and how they do it. Society at large is like this but scaled up. The act of killing is considered justifiable and legitimate in democratic, rule-of-law societies such as the U.S. when the values and norms attached to its unique form of political community are expressed institutionally.

Militaries also seek normative grounding, among other considerations, for their roles and missions in the character of the society they swear to defend. Anecdotally, the serving members of no other U.S. institution express greater normative investment in the flag on the shoulder of their combat fatigues and the declared values it represents. Rule-of-law, the inalienable rights of the individual, freedom from coercive oppression, the democratic right to speak truth to power—these are meaningful connections to the character of political community felt by the military personnel from the U.S. and Australia with whom I have met and interacted. The professional and cultural sanction against the targeting of civilians is another prominent feature of military cultural identity.

The normativity of killing as a profession is deeply woven into the institutions of the military and society. As such, the profound implications for civilian-military-state relations are not captured by the discourse on technical, legal, ethical, and organisational issues presented by military use of machine learning in intelligence products. Matters of the how and why of sanctioned killing go beyond these discourses, which notoriously tend to obscure more than they clarify, particularly in matters of ethics and technology.[4] This brings us to the normative implications of machine learning.

Machine Learning and Military Judgement

The state has eschewed its custodial responsibilities in this regard, preferring to cheerlead based on a host of shaky assumptions.[5] The related role of the large consulting firms in the neoliberal era is receiving renewed attention in turn.[6] While scholars such as Weiss and Mazzucato have shown that the U.S. government’s role in cultivating digital technologies has not been passive, the disruptive power of digital technology has meant the capacity to control its trajectory has been highly protean.[7] The commercial machine learning industry, perhaps for obvious parochial reasons, has preferred not to address the hard question of institutional unravelling, favouring instead those discourses that surface technological and legal puzzles which, while problematic, have yet to succeed in derailing industry practices, aims, and investment streams.

The U.S. military enterprise has found itself drawn along here to some degree. The misappropriation of various insights imported across scientific disciplines has been highly influential on military discourse, effectively creating a muddle of tactical acumen and strategic aims.[8] Builder notes the more mundane reality of service parochialism and funding imperatives. On the options available to the enterprise as it adapts to the digital age, he writes, “Whether the choice is real or not may be less pertinent than the fact that there are factions within the American military that are willing to make the choice seem real to those in and out of uniform who must decide how the military should be organized and funded.”[9]

The imperative of sound military judgement cannot tolerate a long discourse based on what seems real.[10] Machine learning applied to intelligence production is often sold as the enhancement of decision-making. The military is daily implored by private sector consultants and vendors to embrace a “data culture” lest it be rendered the Luddites of the digital era.[11] The allure of faster tactical-level decision-making is not, however, synonymous with better judgement, and when judgement is considered at the strategic level, the tables can turn. Data-driven strategic blindness looms for a military enterprise not alert to the hard case of institutional unravelling. Again, Builder noted as much over two decades ago: “The balancing act is how to embrace the information technologies without being institutionally undone by them.”[12]

Shoulder to Shoulder With Private Citizens

Machine learning applied to intelligence production for the military involves the training of models. The expectation is that the model, after training, will ingest new data and provide enhanced insight via statistical inference. These insights can then be included in intelligence assessments if they are procured by the military user. Large data sets allow the models to be fine-tuned during training. The tuning is merely the adjustment of a host of statistical parameters internal to the model via a process called backpropagation, whereby the model’s outputs are iteratively fit to known outcomes in the training data. The model will often end up either over-fit or over-generalised. In the former, the model has learned its training data so closely that it treats genuinely novel data as extraneous. In the latter, the model’s learned patterns are so loose that it treats novel data as intrinsic, forcing it into categories where it may not belong. Either way, models can be unreliable when they encounter non-training data and need to be treated with scepticism. This is the bias problem and the black-box problem wrapped up together. It is typically treated as a technical obstacle.
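The parameter-adjustment process described above can be sketched in miniature. The following is a hypothetical, illustrative toy only: a one-parameter model fit by gradient descent, the same iterative loop that backpropagation scales up to millions of internal parameters in production models. All names and values here are invented for illustration.

```python
# Minimal sketch of "training as parameter adjustment" (illustrative only).
# A one-parameter model y_hat = w * x is fit to known outcomes by
# repeatedly nudging w down the gradient of the squared error.

def train(data, lr=0.1, epochs=100):
    w = 0.0  # single internal parameter, adjusted during training
    for _ in range(epochs):
        for x, y in data:                 # (input, known outcome) pairs
            y_hat = w * x                 # model's current prediction
            grad = 2 * (y_hat - y) * x    # gradient of (y_hat - y)^2 w.r.t. w
            w -= lr * grad                # adjust the parameter
    return w

# Training data generated by the rule y = 2x; training recovers w ≈ 2.
training_data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
w = train(training_data)
print(round(w, 2))  # converges toward 2.0
```

The over-fit/over-generalised tension enters once the model has many more parameters than this: with enough flexibility it can memorise the training pairs exactly (treating anything novel as extraneous), while with too little it smooths over real distinctions (treating everything novel as intrinsic).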

When the entire digital ecosystem is interrogated, however, the normativity problem comes into view. Mass surveillance of neutral human behaviour has ridden alongside the digital revolution under commercial terms. Without this commercial ushering in, large data sets for model training simply would not exist. Behind extensive discourses on the privacy, ethical, and legal issues presented by the commercial turn of the digital age lurks the normative question of the citizen’s right to obscurity in a free and open society.[13] The right to obscurity had not been formalised into legal frameworks prior to the digital age because, prior to the advent of ubiquitous mobile computing, it was the default condition of every citizen, whose neutral behaviour was of no commercial value. The state presided over the citizen’s default obscurity, under which specific circumstances had to be met for it to be violated. We have left this world behind.

Civilian affairs offer the military some cues on what this could mean. Viljoen theorised “horizontal data relations,” addressing how the “datafication” of everyday life at the individual level expresses effects which must be understood at the population level: “Individualist data subject rights cannot represent, let alone address, these population-level effects.”[14] The implications of this analysis for civilian-military relations have been under-represented.[15] For our purposes, let us state the problem clearly. The surveilled condition of everyday citizens is inextricably connected via horizontal data relations to the generation of statistical inference, which may inform an intelligence product whose end user prosecutes a military mission. In other words, it connects citizens directly to killing in ways they have not been connected before. Military technical innovation does not float freely. Human institutions as deeply woven as those enjoining the state, the military, and the citizen should attract an even higher level of voluntary scrutiny than those adjoining civilian affairs when technological innovation is considered. So far, the inverse has been true. The problem is that scrutiny will come to the military and the state, voluntarily or not.

Conclusion

International relations and security studies scholars can feel bamboozled by the techno-centric discourse that tends to dominate the mainstream. This is a shame. Any sufficiently transformative technology regime will have its most consequential impact at the institutional level, and when military affairs are enjoined, we are pressed to consider the normativity of killing as the foundation of the modern state and its mandate to govern free people. Further, when society asks of the military that which, in order to deliver, the latter must transform itself, what the military comes to ask of society will be commensurate. Technology which masquerades as a free pass is not free. Nowhere is this collision more urgently in need of better understanding than in the area of machine learning and its applicability in matters of military judgement.
