Governing others: Anomaly and the algorithmic subject of security

Abstract

As digital technologies and algorithmic rationalities have increasingly reconfigured security practices, critical scholars have drawn attention to their performative effects on the temporality of law, notions of rights, and understandings of subjectivity. This article proposes to explore how the 'other' is made knowable in massive amounts of data and how the boundary between self and other is drawn algorithmically. It argues that algorithmic security practices and Big Data technologies have transformed self/other relations. Rather than the enemy or the risky abnormal, the 'other' is algorithmically produced as anomaly. Although anomaly has often been used interchangeably with abnormality and pathology, a brief genealogical reading of the concept shows that it works as a supplementary term, which reconfigures the dichotomies of normality/abnormality, friend/enemy, and identity/difference. By engaging with key practices of anomaly detection by intelligence and security agencies, the article analyses the materialisation of anomalies as specific spatial 'dots', temporal 'spikes', and topological 'nodes'. We argue that anomaly is not simply indicative of more heterogeneous modes of othering in times of Big Data, but represents a mutation in the logics of security that challenges our extant analytical and critical vocabularies.

Publication
European Journal of International Security