The Politics of Algorithm Governance
Gavin JD Smith
Abstract
Everyday surveillance work is increasingly performed by non-human algorithms. These entities can be conceptualised as machinic flâneurs that engage in distanciated flânerie: subjecting urban flows to a dispassionate, calculative and expansive gaze. This paper provides some theoretical reflections on the nascent forms of algorithmic practice materialising in two Australian cities, and some of their implications for urban relations and social justice. It looks at the idealisation – and operational black boxing – of automated watching programs, before considering their impacts on notions such as ‘the right to the city’ and ‘the right to the face’. It will argue that the turn to facial recognition software for the purposes of automating urban governance reconstitutes the meanings and phenomenology of the face. In particular, the fleshly and communicative physicality of the face is reduced to a measurable object that can be identified by a virtualised referent and then consequently tracked. Moreover, the asymmetrical and faceless nature of these machinic programs of recognition unsettles conventional notions of civil inattention and bodily sovereignty, and the prioritisation given to pattern recognition renders them amenable to ideas/ideals from phrenology and physiognomy. In this way, algorithmic governance may generate not only forms of facial vulnerability and estrangement, but also facial artifice, where individuals come to develop tacit and artful ways of de-facing and re-facing in order to subvert the processes of recognition which leverage these modes of biopower. Thus, the datafication of urban governance gives rise to a dynamic biopolitics of the face.
Keywords
The black box city, surveillance, algorithmic governance, biometrics, facial recognition, biopolitics
This article is part of the special theme on The Black Box Society. To see a full list of all articles in this special theme, please visit: https://journals.sagepub.com/page/bds/collections/revisitingtheblackboxsociety
imperative of these transformations is to render the city programmable so that it can be subject to emerging forms of technoscientific management. Digital technologies, artificial intelligence and automation are touted by public agencies and commercial companies alike as being the pre-eminent tools for solving all manner of known and indeterminate social and ecological crises associated with ‘cityness’: poverty, crime, congestion, pollution, wastage, and so on (Sadowski and Bendor, 2019; Sassen, 2013). In the context of the unfolding COVID-19 pandemic, we can observe city authorities in countries like Russia turning to facial recognition software solutions as a means to govern people’s movements and mobilise swift responses to those violating the lockdown restrictions.2

As a consequence of these sociotechnical imaginaries and transformations, everyday surveillance work in the city is increasingly enacted by non-human algorithms. We might conceptualise these entities as machinic flâneurs that engage in practices of distanciated flânerie: subjecting the flows of the urban environment to a dispassionate, mechanistic and calculative gaze. In contrast to their human and pre-mediated precursors, this witnessing is orientated less by pleasure, aesthetics, spontaneity and curiosity than by more instrumentalised motivations: to objectivate a growing collection of urban phenomena and circulations so they can be better governed. While the flâneur imagined by Walter Benjamin was physically, temporally and viscerally located in the space and action encountered, the machinic version conversely projects a presence into the space while assuming an intangibility from – and indifference to – it. Acquiring stimulus from the manifold sensors and digital technologies that track and intermediate urban relations, the algorithms constituting practices of distanciated flânerie are expected to perform several analytical repertoires simultaneously, from facial, number plate and even silhouette recognition to congestion tracking, social interaction analysis and remote coordination of critical infrastructures. The rationale for this software is to further economise, commercialise and securitise urban dynamics, and to generate insights for actionable responses. And yet, the nature, arrangement and operativity of these mechanisms remain entirely unintelligible and unaccountable to the publics and public spaces they routinely expose and modulate.

This paper provides some theoretical reflections on the nascent forms of algorithmic practice materialising in the Australian cities of Darwin and Perth, and some of their implications for urban relations and social justice. It looks at the idealisation – and operational black boxing – of automated watching programs, before considering their impacts on notions such as ‘the right to the city’ and, in some instances, ‘the right to the face’. It will argue that the turn to facial recognition software for the purposes of automating urban governance – and exercising social control – reconstitutes the meanings and phenomenology of the face, giving rise to a dynamic biopolitics.

The work of watching in the black box city

Further fine-grained research is required to open more black boxes to reveal their divergent contents, particularly in an age when the making and profiling of visibility fields . . . is progressively the work of the computerized algorithmic code; a watching repertoire that is pre-programmable. (Smith, 2014: 162)

A few years back, I concluded my ethnography on the backstage watching practices and cultures of CCTV camera operators in the UK by forecasting the progressive displacement of these human surveillance workers by non-human algorithmic equivalents. The research was conducted at a time when notions of the ‘sentient’ and ‘smart’ city were starting to grip the attention, parlance and activities of urban planners and managers (as well as industry technologists and marketers). Significant developments were occurring in the design and uptake of networked sensor devices, and machine learning software and data science were becoming increasingly potent. Part of this drive to marginalise the presence of human surveillance operators in various ‘centres of calculation’ (Latour, 1987) was undoubtedly motivated by a desire to lower operating costs and maximise system efficiency levels, but also by cunning marketing on the part of those vending the algorithmic ‘solutions’. Algorithms, for instance, do not need to be remunerated for their continuous hours of observation and notification work; they do not belong to unions; they do not get sick; and they do not request leave. Moreover, they are manipulable and require only minimal supervisory oversight.3 But the turn to software solutions was equally precipitated by their statistical basis and thus scientific genealogy, and by their calculative and unfeeling nature. That is to say, algorithms do not imbue what they witness with emotion and meaning, and they are not adversely affected by their labour. Instead, they can simultaneously and dispassionately enact multiple repertoires of listening and watching, before automating a response.

All of this is in contradistinction to the discerned physiological and psychological limitations of their human counterparts, in terms of the various constraints of (a) the human eye and brain, which can only focus on and process limited stimuli for restricted periods of time, and (b) humans performing pattern recognition at scale (identification work) and memorising – also being able to instantaneously recall – infinite details.4 Indeed, video analytic software has an additional entrepreneurial advantage vis-à-vis the potentially infinite number of hours of footage generated from CCTV camera systems that has historically gone – and continues to go – unseen and thereby wasted: it has the capacity to extract surplus value from the rapid processing of captured visual images, be that in retrospect or in real time (qua the correlations identified, intelligence generated and time/labour savings made). Moreover, various high-profile data breach scandals in recent times have exacerbated public concerns regarding the conduct of human surveillance workers.5

A further, perhaps more significant, factor explaining the replacement of humans by non-humans in practices of watching, beautifully illustrated in the above epigraph, is the notion that it is more palatable – and maybe even comforting – for the watched to imagine an automated computer program shadowing their movements than an unknown, and intrinsically subjective and judgemental, set of human eyes. There is an interesting evocation in Josh Sattler’s opening statement that scientifically derived, machine-driven profiling has a depersonalised and value-free orientation, and that it is thereby more impartial – and less intimidating and fallible – than a human profiler. Publics are invited to invest trust (and substantial taxes) in the purity of the numbers and calculative rules underpinning processes of automation, and are presumed to prefer the more anonymised relations of ‘civil inattention’ (Goffman, 1972: 385) these codes purportedly generate. Machinic automation, it is intimated, has a virtuousness about it, helping curb the arbitrariness of human interference and mediation.

Frank Pasquale’s (2015) influential book, The Black Box Society, is instructive in this regard, offering a compelling account of the political, economic, social and cultural circumstances that have given rise to the pre-eminence of code as a major force of social organisation, while normalising the kinds of techno-utopian sentiment and solutionism seen above. Pasquale uses the notion of ‘black boxing’ to describe how social and economic power is exercised and leveraged by corporations and institutions in ways which have become increasingly complex, opaque and unaccountable. A notorious example of this process was the Australian government’s use of the now discredited ‘Robodebt’ program, a scheme which saw thousands of individual welfare recipients algorithmically lumbered with erroneous debts and penalties that derived from unlawful practices of ‘income averaging’.6

Pasquale elucidates the latent nature of algorithmic processes, and the logics of secrecy and illegibility they typically embody, while also highlighting their paradoxical visibility as discursive typifications in marketing and governmental discourses. He points to the hyped claims made by advocates of algorithmic solutions, who accentuate the superior scope, objectivity, efficacy – and scientific foundations and prophetical features – of machinic and artificial intelligence, and who praise – and evangelicalise – the virtues of automation. As he notes, ‘The allure of the technology is clear — the ancient aspiration to predict the future, tempered with a modern twist of statistical sobriety’ (Pasquale, 2015: 216). Crucially, Pasquale demonstrates the way these entities operate in accordance with how they have been historically, socially and organisationally scripted, functioning to both compress and distort material phenomena and processes into metrics, scores and profiles, which tend to reflect, reify and reproduce the interests of privileged groups and which ensure resources and penalties are unevenly distributed. That is to say, the major insights from his analysis are the invisible, and yet ubiquitous, character of algorithmically driven power; the potential for these systems to intensify social injustices; and the cunning promotional discourses which frame algorithms as a panacea to social crises, helping them colonise social relations and activities in proliferating domains.

Neoliberal urbanism is a very receptive milieu for the implementation of such infrastructure, and this paper adapts Pasquale’s broader metaphor to conceptualise ‘the black box city’, where urban flows and relations are increasingly registered, recognised and modulated by agentive sensors animated by unseen software. Importantly, the black box city manifests under a corporatised nomenclature of ‘smartness’, the guiding mantra of which is the enhanced visibility, automation and optimisation of urban processes, and the progressive displacement of humans from the apex of hierarchies of governing. An outcome of this arrangement is the steady disappearance of accountability in urban governance.

In what follows, I wish to suggest that contemporary regimes of data-driven/non-human urban governance manifest an absent–presence. This quality makes them exceptionally hard for outsiders to tangibly observe, understand and interrogate. Thus, although the devices (e.g. smartphones, air sensors and CCTV cameras) that intermediate everyday urban actions have a discernible, if increasingly unexceptional, material presence in the city, the immaterial codifications (what Mackenzie, 2006 refers to as ‘secondary agency’) orientating them are distinctly absent. As devices, data and codes stretch between theatres of the street and centres of calculation, they come to embody both a foregrounded (public) and a backgrounded (private) form. Moreover, when melded to the reductionist, promissory discourse of smartification – where smartness is a referent for ‘virtue’ and is constitutive of a set of dividends – the complex technical scripting and operating procedures of these systems are further black boxed under a veneer of progress and deliverance. This discursive framework serves to obscure: (a) how such codifications are derived from selective and simplified constructions of the past and (b) the kinds of future-ing they do and will likely contribute to. Somewhat ironically, given the metaphor of enlightening which accompanies the discourse of smartness, this leaves publics in the proverbial darkness regarding how non-human surveillance work is actually being conducted, and without the necessary information – or deliberative scope – to assess its purpose, value and fairness; or to effectively challenge its presence and legitimacy.

While significant, this situation is not new. The details of previous urban governance measures, like the introduction of CCTV camera networks into public space, were also rendered opaque to the public: in terms of the evidence-base and methodologies on which assertions were derived; and the broader funding, operating and accountability arrangements of the systems (Smith, 2014). It has since transpired that the hyperbolic claims made by self-interested parties regarding CCTV’s impact on reducing crime and disorder have not been realised by the technology’s human-mediated operating realities (see Norris and Armstrong, 1999; Smith, 2014). As a consequence, the vast sums of public monies invested and private profits made in the course of their implementation and operation have been called into question.7 And yet, notwithstanding the apparent failure of this technocratic policy to actualise the promises made, we see a similar modus operandi being mobilised by urban authorities and legislators to craft their mono-vision of city smartness: a smartness that is almost entirely contingent on the assumed superior techno-cognitive capacities, attentiveness and responsiveness of non-human agents, be they data or codes.

A distinguishing feature of this deterministic discourse is the progressive purging of human contaminants – and thus perceived fallibilities – from the machinic repertoires of surveillance work. Notwithstanding the long tradition of scholarship which accentuates the socio-cultural shaping of all technical apparatuses – from their design and development, to their implementation and operationalisation – there is an important methodological and ethical implication of this move from human to non-human forms of watching. While the former have been physically situated behind screens in closed settings, there has always been the possibility of doing empirical studies on the symbolic and material dimensions of surveillance practices, exploring the situated meaning worlds and socio-technical actions of embodied agents as they engage in surveillance work, either in situ or retrospectively. But with the rise and use of the latter, which are predominantly immaterial and indeterminate in form and action, this potential has been practically foreclosed.

The machinic flâneur: Automating control and its implications for ‘the right to the city’ and ‘the right to the face’

In recent times, a number of newspaper articles and local council press releases have reported the turn to data-driven and algorithmic solutions by urban officials in the Australian capital cities of Perth, WA, and Darwin, NT. Both of these places are experimenting with and implementing various digital and sensorised measures, and algorithmic analytics, in a bid to generate what proponents of these systems present as a set of efficiency, security and insight dividends:

We feel this will enable us to respond quicker to community safety concerns and also enable us to react to situations, such as missing children et cetera . . . in a quicker, more efficient way . . . Facial recognition technology assists frontline staff in responding faster to potential incidents and undertaking preventative measures.8

Daniel High, City of Perth

[It will tell us] where people are using Wi-Fi, what they’re using Wi-Fi for, are they watching YouTube etc, all these bits of information we can share with businesses . . . We can let businesses know ‘hey, 80 per cent of people actually use Instagram within this area of the city, between these hours’.9

Josh Sattler, Darwin Council

Another rationality provided for these systems is the symbolic and economic importance of leading what might be framed as the national ‘race to smartness’. Perth authorities have been deploying facial recognition software on some of their public space CCTV cameras, as well as developing video analytic capabilities to count and track people, vehicles and cyclists, and predict crowd and anti-social behaviour.10 Darwin, meanwhile, has installed ‘poles fitted with speakers, cameras and Wi-Fi [which] allow council to gain data on how many people walk on what footpaths and where they use certain websites and apps in the city’,11 and it has introduced 24 environmental sensors and smart parking sensors to monitor various urban flows.

These developments, and their situation within particular models of neoliberal urban governance, bring significant implications for meanings and uses of public space, as well as for democratic principles and civil rights such as privacy, anonymity, accountability and equity. They enable urban authorities to use non-human surveillance workers to ambiently, but anonymously and unilaterally, track the mobilities of citizens as they enter and transit through space. On some of these systems, biometric capabilities enable the linking of scanned faces to various backgrounded databases, the one in Perth tellingly described as a ‘Black Watchlist’: ‘It proposes to use the system in East Perth to detect known troublemakers and people wanted by the police on a council-run “Black Watchlist.”’12 Who is on the list and for what reason, and how/whether people can get off it, is a matter veiled in opacity. By virtue of this, and in combination with other system capacities which permit the scraping and assembling of metadata extracted from smartphones, these machinic flâneurs will be able to gradually detect and memorise the identities and potential habits of all street actors, and likely conduct forms of trend analysis on aggregations, breaking down who uses what apps, when and where. Crucially, it appears that this can be done without the knowledge and consent of those from whom the data derives, and over whom such currencies can be consequently leveraged in differentiated and differentiating ways.
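To make the kind of ‘trend analysis on aggregations’ gestured at here more concrete, the following minimal sketch (in Python, using pandas) shows how scraped app-usage metadata could be grouped by area and hour to yield the sort of percentage claims quoted from Darwin Council above. The field names and table structure are illustrative assumptions only, not a description of any deployed council system.

```python
# Hypothetical sketch: aggregating app-usage metadata from Wi-Fi sessions by
# area and hour. Column names ("area", "hour", "app", "device_id") are assumed
# for illustration; they do not describe any actual council schema.
import pandas as pd

def app_usage_shares(sessions: pd.DataFrame) -> pd.DataFrame:
    """For each area and hour, return the share of observed devices seen using each app."""
    devices_total = (
        sessions.groupby(["area", "hour"])["device_id"].nunique().rename("devices_total")
    )
    devices_per_app = (
        sessions.groupby(["area", "hour", "app"])["device_id"].nunique().rename("devices_app")
    )
    shares = devices_per_app.reset_index().merge(
        devices_total.reset_index(), on=["area", "hour"]
    )
    shares["share"] = shares["devices_app"] / shares["devices_total"]
    return shares.sort_values(["area", "hour", "share"], ascending=[True, True, False])

# A claim such as "80 per cent of people use Instagram in this area, between
# these hours" is simply the row of this table where area, hour and app match.
```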
In the case of Perth, the turn to recognition software for urban governance initiates a complex biopolitics of the face. The face is transformed from being a fleshly, expressive and indeterminable medium – something we are all literally and metaphorically attached to – to becoming an immaterialised static object for targeting, measurement and identification – something we are separated from as it is converted into a set of machine-readable binary numbers. In this process, the meaning and phenomenology of the face is reconstituted. In scanning crowds of faces, or faces in the crowd, the recognition software instantaneously detaches impressions of the faces encountered. It converts these into a disembodied and one-dimensional simulacrum which then enters into a hidden and encoded dialogue with the virtualised faces situated in the database. If a match exists, the disembodied face is instantaneously identified, meaning the embodied face can then be anchored to an institutionalised identity, but also tracked in real time. The facial simulacrum therefore helps to betray its bearer’s presence and biography, combining with other techniques to detect and modulate mood or to determine how that individual will be treated. The asymmetrical and faceless nature of these machinic programs of recognition unsettles the notions of civil inattention and bodily sovereignty that Goffman (1963) noticed in his studies, and the prioritisation given to pattern recognition renders them amenable to ideas/ideals from phrenology and physiognomy. There is a distinct sense that a person’s sovereignty over the materiality of their face is progressively eroded by the presence and agentive powers of its simulated derivative, an entity as likely scraped from previous social media activity as scanned from official documentation. This situation evidently produces significant power differentials, leaving the exposed subjects without recourse to contest their images being biobanked for recognition analytics, revenue generation and the flexing of biopower. As Civil Liberties Australia president Kristin Klugman recently told a Parliamentary enquiry:

To override a person’s consent in the name of safety and security is not respecting Australian adults; it is treating them as you would a child and so is the very definition of paternalism . . . it is only a matter of time before obscuring one’s face in public becomes a crime.13

In this way, algorithmic governance may generate not only forms of facial vulnerability and estrangement (where the scanned face betrays bodily presence, generates misrecognition or is deterministically profiled), but also facial artifice, where individuals come to develop tacit and artful ways of de-facing (obstructing facial accessibility) and re-facing (changing facial appearance) in order to subvert the processes of recognition which leverage these modes of biopower. While biometric surveillance systems might assume a stable object to be instrumentally determined, people are quite literally attached to the materiality of their face, and they subjectively experience this bodily border as the primary stage on which social relations and interaction orders are intermediated. Indeed, people invest considerable time working on their faces (beautifying them, as well as manipulating the sentiments and impressions they communicate), and some are likely to experience the indiscriminateness, formality and asymmetry of facial recognition analytics as a form of dispossession. Moreover, individuals believe in their autonomy to subjectively disguise, decorate, cover or cosmetically alter the appearance of their faces contingent on social situation, and demand barriers to facial accessibility for a range of cultural, religious, biomedical, psychological or interpersonal reasons. Faces, therefore, are not merely passive or innate objects for mechanised modes of scanning and tracking, but rather complex masks that are imbued with considerable symbolic meaning and social significance. In this way, facial recognition programs fundamentally impinge on the right to the face, and for this reason – and the myriad ways they unilaterally mine and exploit the face – we can expect these technologies of governance to mobilise considerable resistance.
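The matching routine sketched in the preceding paragraphs, in which a detected face is reduced to a numerical template and compared against a backgrounded watchlist, can be illustrated schematically. The embedding function, similarity threshold and watchlist structure below are hypothetical placeholders rather than details of the Perth or Darwin systems.

```python
# Simplified, hypothetical illustration of watchlist matching: a detected face
# is reduced to an embedding vector and compared against stored templates.
# The embedding model, watchlist structure and threshold are assumptions made
# for illustration only.
from typing import Dict, Optional
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(face_embedding: np.ndarray,
                            watchlist: Dict[str, np.ndarray],
                            threshold: float = 0.6) -> Optional[str]:
    """Return the watchlist identity whose stored template best matches the
    detected face, or None if no similarity exceeds the threshold."""
    best_id, best_score = None, threshold
    for identity, template in watchlist.items():
        score = cosine_similarity(face_embedding, template)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id

# The embodied face is anchored to an institutionalised identity only if a
# stored template scores above the threshold; otherwise the detection passes
# unmatched, though it may still be logged and tracked.
```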
More than merely recognising and cross-matching the faces and appearances of moving entities with their static digital identifiers, the machinic flâneurs are capable of projecting a disembodied voice onto the street via the audio speakers, and of enacting virtual fencing around particular places regarded as being of high risk or value. If these perimeters are transgressed, an automatic response is duly triggered. As City of Darwin’s General Manager of Innovation, Growth and Development Services explains, ‘We’ll be getting sent an alarm saying “there’s a person in this area that you’ve put a virtual fence around” . . . boom an alert goes out to whatever authority, whether it’s us or police to say “look at camera five.”’ This is a prime example of commercially generated code colonising public space with proprietary models and rendering it both extractable and defensible. This kind of artificial ‘walling’ or arbitrary division of the city undermines urban ideals such as the right to the city, and it impinges on civic values and liberties such as freedom of movement and freedom of assembly and association. With all these devices exhaustively tracking and tracing social activity, it is not unreasonable to refer to the post-public surveillance space of contemporary cities, where bodies – much like vehicles entering toll roads – must pay a price for their presence: that levy being hypervisibility qua the data traces they continuously leak, but also the progressive reduction of the person and their biography to a sequence of de-contextualised flows. A person’s inability to evade monitoring and tracking, and their need to manage the performativity of leaky data impressions they cannot fully know and control, has implications for their right to rest and leisure: to access and occupy a ‘free’, anonymous and restorative space devoid of surveillance and various ‘data work’ obligations. As I have previously argued (Smith, 2016), surveillance work is performed as much by watchers as by the watched, and it is an exhausting, stress-inducing and often immaterial mode of labour.
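Computationally, the virtual fencing quoted above amounts to little more than a point-in-polygon test wired to a notification channel, as the following hypothetical sketch (using the shapely geometry library) suggests. The fence coordinates, camera label and alerting routine are placeholders, not details of the Darwin installation.

```python
# Minimal sketch of virtual fencing: if a detected person's coordinates fall
# inside a designated polygon, an alert is dispatched to the relevant authority.
# The fence coordinates, camera label and notify() routine are placeholders.
from shapely.geometry import Point, Polygon

HIGH_RISK_FENCE = Polygon([(130.839, -12.463), (130.842, -12.463),
                           (130.842, -12.466), (130.839, -12.466)])

def notify(message: str) -> None:
    # Stand-in for whatever alerting channel an operator actually uses.
    print(f"ALERT: {message}")

def check_detection(lon: float, lat: float, camera_id: str) -> None:
    """Raise an alert when a tracked person is detected inside the virtual fence."""
    if HIGH_RISK_FENCE.contains(Point(lon, lat)):
        notify(f"person inside fenced area; look at camera {camera_id}")

# Example detection falling inside the assumed fence.
check_detection(130.840, -12.464, "five")
```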
Given that previous empirical research has consistently documented the subjectively driven and thereby prejudicial nature of surveillance operativity, it is not hard to imagine distanciated flânerie being configured to differentially profile targeted faces and bodies, breaching in the process the right of equality:

The gaze of the cameras does not fall equally on all users of the street but on those who are stereotypically predefined as potentially deviant, or through appearance and demeanour are singled out by operators as unrespectable. (Norris and Armstrong, 1999: 201)

New surveillance technologies are regularly tested on marginalised communities that are unable to resist their intrusion. (Magnet and Gates, 2009: 11)

When considering how these governance programmes are likely to be arranged, and how their everyday operations will be socially mediated and felt, it is worth highlighting a few contextual details. Each located on Australia’s furthermost west and north coasts, both Darwin and Perth have materialised as municipalities on the back of centuries of white settler colonialism, where practices of violence, dispossession, displacement, persecution and assimilation have been effected by the white colonies on Indigenous peoples. In both areas, there are high concentrations of Aboriginal and Torres Strait Islander residents relative to the population, and significant levels of racialised division and inequity, in terms of the disproportionate numbers of Aboriginal people who are incarcerated, homeless, on benefits or experiencing chronic health issues.14 As Lobo (2013: 456) writes with reference to Darwin, ‘global and local discourses interpellate asylum seekers and socio-economically disadvantaged Aboriginal people as dehumanised, burdensome welfare dependents and toxic bodies who invade and contaminate the space of the city and nation’. Given the cultural histories and social demographics of each city, and the influence of Chinese governmental models on the ‘Switching on Darwin’ scheme15 – where municipal authorities in China are experimenting with facial recognition and other surveillance programs to enact a social credit system for dealing with social, political and viral risks – it is perhaps not unreasonable to assume that these systems will disproportionately target and punish people of colour, and the urban poor more broadly. But because of the algorithmic and closed nature of the governance circulations, this potential outcome will be difficult to ascertain, as it is technically and legally challenging to hold the secondary agency of coded decision-making to account. These concerns are further concretised when one considers the lack of (a) reliable empirical evidence presented to show that these systems are needed, will work and are subject to appropriate oversight, and (b) adequate public engagement and community consultation conducted in the lead-up to these technologies being deployed. As The Guardian (June 12, 2019) recently reported:

The [Perth] council confirmed that authorities would not be required to provide any kind of warrant when asking for the facial recognition capabilities to be switched on, only to provide an image or series of images of a person of interest.

The facial recognition aspect of the new camera network was news to me. I feel that the community should have been better informed.

Lyn Schwan, East Perth Community Safety Group Secretary

Very few people in the community are aware of this trial . . . We haven’t been asked for consent. This is our own biometric data, as unique as our DNA, and there has been no consultation or permission obtained from Perth’s constituents to capture this data or track us as we walk down the street. There is no opt-out function, and no choice for participation by minors. My children are supposed to grow up under these circumstances. I haven’t consented to this, and neither have they.

Lauren Mac, Perth Resident

But the beleaguered [Perth] authority . . . is refusing to say exactly which cameras will screen citizens because of ‘security reasons’.16

Moreover, at the time of writing, Perth is being temporarily managed by Government-appointed commissioners following the suspension of the city’s council and mayor in early 2018 on various allegations, and so there are also issues of due process and accountability to consider in terms of how important social and infrastructural decisions are formulated and justified.

Conclusion: The struggle for rights in the black box city

This paper has analysed some of the drivers and implications of surveillance work being outsourced to non-human algorithmic contractors. It has conceptualised how these corporatised and machinic flâneurs engage in practices of distanciated flânerie: subjecting relations and flows within the urban environment to a dispassionate, calculative and expansive gaze. The focus has been on the emerging black box city, where techno-utopian discursive frameworks and economic/extractive models of smartness reign supreme, and where a biopolitics of the material and immaterial face is eventuating as a consequence of desires to better recognise and govern bodies moving in space. This biopolitics concerns the blanket machinic tracking of publics through the ‘wearable technology’ of the face: an immoveable and inescapable communicative medium that is intrinsic to the embodied subject, and that, as Erving Goffman was at pains to illustrate, is sacred to the socialities of everyday life. We have considered a few of the many social impacts and consequences of these transformations, not least how such diagrams – by taking faces and movement as their objects of governance – encroach on civil and bodily liberties, and are socially and culturally situated within highly racialised and neoliberalised histories which they are only likely to reify and perpetuate in various ways. But of course, they also act to transform the phenomenology and politics of being in space: physical presence is exhaustively tracked and registered via the contours of the virtualised face (so material faces can be individually identified and subjects nudged to different ends depending on how they are scored and what aggregations they are consequently allocated to), while more specialised assessments and inferences can, on the back of phrenological and physiognomic knowledges and practices, potentially be made about belonging, disposition, mood, intent, wellness and the like.

It will likely prove difficult to conduct close observation of regimes of algorithmic governance for a number of methodological and political reasons. For instance, researchers will need the requisite technical competencies to apprehend the codes scripting the machinery. And they will need access to gatekeepers who are comfortable with the idea of accountability, and who will tolerate the inquisitive – and potentially troubling – presence of an outsider. Moreover, the highly abstract nature of algorithmic governance makes it hard for publics to comprehend, let alone verbalise, how they are being dominated through these techniques. Notwithstanding the promissory techno-utopian discourses that emphasise the inevitability and efficacy of these modes of urban governance, in practice – and on the basis of much STS research – we can anticipate machinic operations being mediated by considerable negotiation, messiness, failure, inadvertence and friction, both in terms of how these sociotechnical enterprises actually interface with local geographies, cultural politics and environmental factors, and in terms of how they attempt to reduce, profile and encode in highly technical ways what are irreducible and unwieldy social forces, actions, experiences and realities. Therefore, as with all telemetric systems (see Smith and O’Malley, 2017), we can expect a series of struggles to ensue over both the right to the city – with various social groups feeling they are being purposefully programmed out of this space by these sociotechnical assemblages – and the right to the face – with various individuals, such as Lauren Mac, defending the sanctity of the face from colonisation by its virtualised referent. That these faceless systems of governance will dispassionately stare into the face of rising levels of social insecurity, as they disproportionately recognise and track the faces most affected by this condition, is a cruel pun not lost on those targeted by them. As with many preceding technologies of power, the core aim of
Norris C and Armstrong G (1999) The Maximum Surveillance Society: The Rise of CCTV. Oxford: Berg.
Pasquale F (2015) The Black Box Society: The Secret Algorithms That Control Money and Information. Cambridge: Harvard University Press.
Sadowski J and Bendor R (2019) Selling smartness: Corporate narratives and the smart city as a sociotechnical imaginary. Science, Technology, & Human Values 44(3): 540–563.
Sassen S (2013) Does the city have speech? Public Culture 25(2): 209–221.
Smith GJD (2014) Opening the Black Box: The Work of Watching. Abingdon: Routledge.
Smith GJD (2016) Surveillance, data and embodiment: On the work of being watched. Body & Society 22(2): 108–139.
Smith GJD and O’Malley P (2017) Driving politics: Data-driven governance and resistance. The British Journal of Criminology 57(2): 275–298.