Facebook and nipples: opera faces an emerging technologies predicament
Censorship highlights the structural social and cultural issues underlying AI systems.
Belinda Henwood
UNSW Media & Content
0432 307 880
b.henwood@unsw.edu.au
Facebook’s censoring of an image from a Wagnerian opera seems, on the surface, to be a classic case of art versus pornography. The reaction was heightened by the fact that it was opera, the ‘highest’ of art forms, so the outcry was predictable and rehashed old arguments about artistic freedom. However, several structural issues behind this story point to larger social and cultural problems that have little to do with opera.
The first issue, and perhaps the one most directly connected to Facebook’s censorship, is the problem with Justice Potter Stewart’s 1964 definition of pornography: “I know it when I see it”. What this suggests is that pornography is an extra-symbolic representational form; in other words, you don’t need to have seen pornography to know what it is.
The truth is that pornography, like all forms of representation, is contextual, cultural and ambiguous. Despite recent crackdowns on pornography on Tumblr and through legislation like the Backpage.com law in America, most people live in a post-porn world where the line between porn and not-porn is blurry. Censorship by corporations like Facebook is driven largely by commercial and political concerns rather than by how their users actually think about porn.
The second, and more structural, issue is how these aesthetic determinations are being made. It is well known that Facebook outsources this moderation work to a variety of countries, under conditions that have serious implications for the people who look at violent, traumatic images all day long without adequate support (listen to Vox’s podcast Today Explained, ‘Friends without Benefits’, 2 March 2019).
The effect is an outsourcing of trauma, in much the same way that pollution was ‘outsourced’ along with manufacturing, keeping America, Europe and Australia insulated from horrific images. The fact that ‘mistakes’ are made, as in the opera case, reflects not only the pressures these workers are under, but also how definitions of pornography, gender and nudity are culturally specific. Cynically, for these systems to work, Facebook would have to train workers in the particular sexist code relevant to the nation the image comes from. Only then could an opera nipple be distinguished from a hip-hop nipple.
The third issue is that, as artificial intelligence (AI) takes over more and more of these aesthetic and cultural decisions, we need to examine how the code itself is ideological. By this I mean exposing how the various relational and statistical computations that determine something like a ‘female nipple’ have their roots in 19th-century eugenic taxonomies, what we would now call ‘biometrics’. Systems such as those developed by Alphonse Bertillon for determining criminality from facial measurements of ‘mug shots’ have long since been discredited, but they still form the foundation of contemporary AI logic.
Google’s AI system, which categorised certain images of black people as ‘gorillas’, indicates not a mistake in the code but a problem with an underlying structure that relies on racist demarcations. This structural violence continues in new forms that deny, censor or label people who do not fit into the white, male standard. Until we understand how representation generates race, gender, the obscene and the explicit, we can’t address the structurally racist and sexist hierarchies that order what images we see, and how we see them.
Dr Tim Gregory is a lecturer at UNSW Art & Design.