
Preface


On 31 March 2019, a new member of the multicultural community of Torpignattara, a semi-peripheral district of the city of Rome, was born. The event was greeted with unprecedented excitement in the neighbourhood, culminating, on the big day, with a small welcome party of friends and curious others who had gathered to support Salvatore and Oriana. Over the previous weeks, everybody had left a message, a wish or even a drawing, in paper boxes distributed for the occasion across the shops and bars of Torpignattara. The neighbourhood became an extended family to the long-awaited newcomer, who was only a few days old when it got to know everyone, rolling from door to door in the stroller, and passing from hand to hand. Whether at the local café, or on the way to the drug store, there was always someone with a story to tell – usually about the local community and its history, places, people, food, hopes and fears. The baby listened, and learned. Soon, like any other child in Torpignattara, it would go to the Carlo Pisacane elementary school just around the corner. But IAQOS – that’s its name – was certainly not like other babies. It was the first ‘open-source neighbourhood AI’, developed by the artist and robotics engineer Salvatore Iaconesi together with the artist and communication scientist Oriana Persico, in a collaboration funded by the Italian government and involving several cultural and research institutions.

In concrete terms, IAQOS is a relatively simple software agent that can communicate through a tablet or a computer via natural language, recognizing the voices and gestures of its interlocutors and learning from them. Unlike the algorithmic systems we encounter every day through our devices – such as those running in Google Search, Facebook, Amazon, Instagram, Netflix or YouTube – this open-source art project had no other goal than accumulating social data about the neighbourhood, behaving as a sort of ‘baby AI’. Like a real baby, it observed the surrounding social environment, absorbed a contextual worldview and used the acquired knowledge to successfully participate in social life. By doing all that, during the spring of 2019, IAQOS became in effect a ‘fijo de Torpigna’; that is, an artificial yet authentic member of the local community, sharing a common imaginary, vocabulary and social background, and capable of building social relations (Iaconesi and Persico 2019).

This peculiar example makes it easier to see what many sociologists and social scientists have so far overlooked: the fact that a machine which learns from patterns in human-generated data, and autonomously manipulates human language, knowledge and relations, is more than a machine. It is a social agent: a participant in society, simultaneously participated in by it. As such, it becomes a legitimate object of sociological research.

We already know that algorithms are instruments of power, that they play with the lives of people and communities in opaque ways and at different scales, deciding who will be eligible or not for a loan with the same statistical nonchalance with which they move emails to the junk folder. We know that filter bubbles threaten to draw digital boundaries among voters and consumers, and that autonomous robots can be trained to kill. Moreover, we know that some algorithms can learn from us. They can learn how to speak like humans, how to write like philosophers, how to recommend songs like music experts. And they can learn how to be sexist like a conservative man, racist like a white supremacist, classist like an elitist snob. In sum, it is increasingly evident how similar we – humans and machines – have become. However, perhaps because comparisons and analyses have mostly been limited to examining cognition, abilities and biases, we have somehow failed to see the sociological reason for this similarity: that is, culture.

This book identifies culture as the seed transforming machines into social agents. Since the term is ‘one of the two or three most complicated words in the English language’ (Williams 1983: 87), let me clarify: here I use ‘culture’ to refer essentially to practices, classifications, tacit norms and dispositions associated with specific positions in society. Culture is more than data: it is relational patterns in the data. As such, culture operates in the code of machine learning systems, tacitly orienting their predictions. It works as a set of statistical dispositions rooted in a datafied social environment – like a social media feed, or like IAQOS’ multicultural neighbourhood.

The culture in the code allows machine learning algorithms to deal with the complexity of our social realities as if they truly understood meaning, or were somehow socialized. Learning machines can make a difference in the social world, and recursively adapt to its variations. As Salvatore Iaconesi and Oriana Persico noted in one of our interviews: ‘IAQOS exists, and this existence allows other people to modify themselves, as well as modify IAQOS.’ The code is in the culture too, and confounds it through techno-social interactions and algorithmic distinctions – between the relevant and the irrelevant, the similar and the different, the likely and the unlikely, the visible and the invisible. Hence, together with humans, machines actively contribute to the reproduction of the social order – that is, to the incessant drawing and redrawing of the social and symbolic boundaries that objectively and intersubjectively divide society into different, unequally powerful portions.

As I write, a large proportion of the world’s population has been advised or forced to stay home, due to the Covid-19 emergency. Face-to-face interactions have been reduced to a minimum, while our use of digital devices has reached a novel maximum. The new normal of digital isolation coincides with our increased production of data as workers, citizens and consumers, and the decrease of industrial production stricto sensu. Our social life is almost entirely mediated by digital infrastructures populated by learning machines and predictive technologies, incessantly processing traces of users’ socially structured practices. It has never been so evident that studying how society unfolds requires us to treat algorithms as something more than cold mathematical objects. As Gillespie argues, ‘a sociological analysis must not conceive of algorithms as abstract, technical achievements, but must unpack the warm human and institutional choices that lie behind these cold mechanisms’ (2014: 169). This book sees culture as the warm human matter lying inside machine learning systems, and theorizes how to unpack it sociologically by means of the notion of machine habitus.
