Data/Body: Corpus and the Cloud Empire of our Lives, Charlotte Kent, published in The Brooklyn Rail
My last column addressed generative art, a practice in which artists often use data sets to create complex works about our world. But where does that data come from? And, more importantly, can the aestheticization of data ignore its historical context or the privacy issues of its contemporary context?
Data colonialism is the how, the extractivist processes through which life gets newly appropriated by capitalism. The social quantification sector is the who, the consortium of private and public players who engage in data colonialism to achieve their financial and political goals. And the Cloud Empire is the what, the overall organization of resources and imagination that emerges from the practices of data colonialism.
–The Costs of Connection, Nick Couldry and Ulises A. Mejias
Data is not neutral; it is typically produced by extraction from our lives. Artists have unveiled that opaque appropriation to help us consider how we might resist it. The topic has been on my mind because recent news has brought the political externalities of collected data systems to the fore. Federal Trade Commission Chair Lina M. Khan expressed concern about “business models that can incentivize endless hoovering up of sensitive user data and a vast expansion of how this data is used” in her proposal to consider possible rules around commercial data practices. When even the government is concerned that corporate abuse of people’s data has gone too far, we are in trouble. This dialogue has resurfaced in response to the inordinate invasion of privacy rights around reproductive healthcare in the United States; the two issues are deeply intertwined because privacy is a right that has always been accorded only to some bodies, with the rest dismissed as quanta to be sorted and sold.
A data set is called a corpus, a poetic allusion to its integrated body of information. Often the data is an extraction of lived bodies’ experiences, but lacunae in that corpus also expose cultural amputations. The artist Stephanie Dinkins just exhibited On Love and Data at the Queens Museum, and much of her work reveals the importance of including more people and voices in the development of our technological tools. The extent of racial bias in these systems has been well reported, but Dinkins’s creative technē inquires, and subtly proposes, how technologies could embody a logic of care and mutuality rather than constant extraction. This is particularly important as AI becomes widespread, a set of technologies and practices that will need to be expressly addressed in future columns.