new extractivism, Vladan Joler

narrator Vladan Joler
term new extractivism
published October 2019, Novi Sad

Note: During the GCK seminar, the term New Extractivism was presented by Vladan Joler, who co-authored the text with Kate Crawford for the Anatomy of an AI System project published on


At this moment in the 21st century, we see a new form of extractivism that is well underway: one that reaches into the furthest corners of the biosphere and the deepest layers of human cognitive and affective being. The stack behind contemporary technological systems goes well beyond the multi-layered “technical stack” of data modelling, hardware, servers and networks. The full stack reaches much further into capital, labour and nature, and demands an enormous amount of each. The true costs of these systems – social, environmental, economic, and political – remain hidden and may stay that way for some time.


It’s necessary to move beyond a simple analysis of the relationship between an individual human, their data, and any single technology company in order to contend with the truly planetary scale of extraction. Vincent Mosco has shown how the ethereal metaphor of “the cloud” for offsite data management and processing is in complete contradiction with the physical realities of the extraction of minerals from the Earth’s crust and the dispossession of human populations that sustain its existence. Sandro Mezzadra and Brett Neilson use the term “extractivism” to name the relationship between different forms of extractive operations in contemporary capitalism, a relationship we see repeated in the context of the AI industry. There are deep interconnections between the literal hollowing out of the materials of the Earth and biosphere, and the data capture and monetisation of human practices of communication and sociality in artificial intelligence (AI). Mezzadra and Neilson note that labour is central to this extractive relationship, a pattern repeated throughout history: from the way European imperialism used slave labour, to the forced work crews on rubber plantations in Malaya, to the Indigenous people of Bolivia driven to extract the silver that was used in the first global currency. Thinking about extraction requires thinking about labour, resources, and data together. This presents a challenge to critical and popular understandings of AI: it is hard to “see” any of these processes individually, let alone collectively.


Looking from the perspective of deep time, we are extracting Earth’s history to serve a split second of technological time, in order to build devices that are often designed to be used for no more than a few years. For example, the Consumer Technology Association notes that the average lifespan of a smartphone is 4.7 years. This obsolescence cycle fuels the purchase of more devices, drives up profits, and increases incentives for the use of unsustainable extraction practices. Formed through slow elemental processes, these materials then pass through an extraordinarily rapid period of excavation, smelting, mixing, and logistical transport – crossing thousands of kilometres in their transformation.


Drawing out the connections between resources, labour and data extraction brings us inevitably back to traditional frameworks of exploitation. But how is value being generated through these systems? A useful conceptual tool can be found in the work of Christian Fuchs and other authors examining and defining digital labour. The notion of digital labour, initially linked with different forms of non-material labour, precedes the life of devices and complex systems such as AI. Digital labour – the work of building and maintaining the stack of digital systems – is far from ephemeral or virtual; it is deeply embodied in a range of activities. The scope is overwhelming: from indentured labour in mines extracting the minerals that form the physical basis of information technologies, to the work of strictly controlled and sometimes dangerous hardware manufacturing and assembly processes in Chinese factories; from exploited outsourced cognitive workers in developing countries labelling AI training data sets, to the informal physical workers cleaning up toxic waste dumps. These processes create new accumulations of wealth and power, which are concentrated in a very thin social layer.


Current machine learning approaches are characterised by an aspiration to map the world, a full quantification of the visual, auditory, and recognition regimes of reality. From a cosmological model of the universe to the world of human emotions as interpreted through the tiniest muscle movements in the human face, everything becomes an object of quantification.


Jean-François Lyotard introduced the phrase “affinity to infinity” to describe how contemporary art, techno-science and capitalism share the same aspiration to push boundaries towards a potentially infinite horizon. The second half of the 19th century, with its focus on the construction of infrastructure and the uneven transition to an industrialised society, generated enormous wealth for the small number of industrial magnates that monopolised the exploitation of natural resources and production processes. The new infinite horizon is data extraction, machine learning, and reorganising information through AI systems of combined human and machine processing.


Such unrestrained thirst for new resources and fields of cognitive exploitation has driven a search for ever deeper layers of data that can be used to quantify the human psyche, conscious and unconscious, private and public, idiosyncratic and general. In this way, we have seen the emergence of multiple cognitive economies, such as the attention economy, the surveillance economy, the reputation economy, and the emotion economy, as well as the quantification and commodification of trust and evidence through cryptocurrencies.


“The ‘enclosure’ of biodiversity and knowledge is the final step in a series of enclosures that began with the rise of colonialism,” Vandana Shiva explains. In Shiva’s words, “the destruction of the commons was essential for the Industrial Revolution, to provide a supply of natural resources for raw material to industry. The commons, therefore, had to be privatised, and people’s sustenance base in these commons had to be appropriated, to feed the engine of industrial progress and capital accumulation.” While Shiva is referring to the enclosure of nature by intellectual property rights, the same process is now occurring with machine learning – an intensification of quantified nature. The new gold rush in the context of AI is to enclose different fields of human knowing, feeling, and action, in order to capture and privatise those fields.


Increasingly, the process of quantification is reaching into the human affective, cognitive, and physical worlds. Training sets exist for the detection of emotion, for family resemblance, for tracking an individual as they age, and for human actions like sitting down, waving, raising a glass, or crying. Every form of biodata – including forensic, biometric, sociometric, and psychometric – is being captured and logged into databases for AI training. The training sets for AI systems claim to be reaching into the fine-grained nature of everyday life, but they repeat the most stereotypical and restricted social patterns, re-inscribing a normative vision of the human past and projecting it into the human future.


These processes of resource, labour and data extraction create new accumulations of wealth and power, concentrated in a tiny sliver of society dominated by a few global mega-corporations, which are creating new infrastructures and mechanisms for the accumulation of capital and the exploitation of human and planetary resources.