Permanent and temporary commissions and projects

The Next Biennial Should be Curated by a Machine: Experiment AI-TNB

The Next Biennial Should be Curated by a Machine is an inquiry into the relationship between curating and Artificial Intelligence (AI). The project unfolds as a series of experiments exploring the application of machine learning techniques (a subset of AI) to curating large-scale contemporary art exhibitions, in order to reimagine curating as a self-learning human-machine system. Referring to the 2013 e-flux project ‘The Next Documenta Should Be Curated by an Artist’ – which questioned the structures of the art world and the privileged position of curators within it – the project extends this questioning to AI. It asks how AI might offer new, alien perspectives on conventional curatorial practices and curatorial knowledge. What would the next Biennial, or any large-scale exhibition, look like if AI machines were asked to take over the curatorial process and make sense of a vast amount of art-world data that far exceeds the capacity of the human curator alone?

Experiment AI-TNB, the second in the series, takes the Liverpool Biennial 2021 edition as a case study to explore machine curation and visitor interaction with artworks already selected for the Biennial by its curator. It uses data from the Biennial exhibition as its source – the photographic documentation of artworks, their titles and descriptions – and applies machine learning to generate new interpretations and connections. At its heart is OpenAI’s ‘deep learning’ model CLIP, released in 2021, which measures the similarity between an image and a short text.
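
To illustrate this core operation, the minimal sketch below (an assumption of this text, not the project’s own code) uses the openly released CLIP model to score how well an image matches a few candidate texts; the image path and example captions are placeholders.

```python
import torch
import clip  # https://github.com/openai/CLIP
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/32", device=device)

# Placeholder inputs: a documented artwork photograph and candidate texts.
captions = ["a seascape", "a portrait", "an abstract collage"]
image = preprocess(Image.open("artwork.jpg")).unsqueeze(0).to(device)
texts = clip.tokenize(captions).to(device)

with torch.no_grad():
    image_features = model.encode_image(image)
    text_features = model.encode_text(texts)

# Cosine similarity between the image and each candidate text.
image_features = image_features / image_features.norm(dim=-1, keepdim=True)
text_features = text_features / text_features.norm(dim=-1, keepdim=True)
similarity = (image_features @ text_features.T).squeeze(0)

for caption, score in zip(captions, similarity.tolist()):
    print(f"{score:.3f}  {caption}")
```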

On the project’s landing page, visitors encounter fifty eerie images – some of which look like photographs, others like drawings or collages. These images are generated by AI in response to the titles of the source artworks, using CLIP to guide a GAN (Generative Adversarial Network) into creating an image that ‘looks like’ a particular text. Navigating through the experiment, visitors are presented with a triptych of images and texts, with the source artwork placed in the centre, the AI-generated image on the left, and a heatmap overlaid on the source image on the right. ‘Deep learning’ models are used to create new links between the visual and textual material, as well as entirely new images and texts. Every page is also a trifurcation: visitors can explore the links between the original source and generated material, word and image, art and data. As visitors navigate the project, they create their own paths through the material, each journey becoming a co-curated human-machine iteration of the Biennial saved to the project’s public repository (Co-curated Biennials).
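
The general idea of CLIP-guided generation can be sketched as follows: a candidate image is repeatedly adjusted so that its CLIP embedding moves closer to the embedding of a target text. The sketch below illustrates the general technique only and is not the project’s implementation; for simplicity it optimises a raw image tensor directly in place of a GAN generator, and the prompt is a placeholder.

```python
import torch
import clip  # https://github.com/openai/CLIP

device = "cuda" if torch.cuda.is_available() else "cpu"
model, _ = clip.load("ViT-B/32", device=device)
model = model.float()  # keep everything in fp32 so gradients flow cleanly
for p in model.parameters():
    p.requires_grad_(False)

# Placeholder prompt standing in for an artwork title.
text = clip.tokenize(["an eerie collage of sea and machinery"]).to(device)
with torch.no_grad():
    text_features = model.encode_text(text)
    text_features = text_features / text_features.norm(dim=-1, keepdim=True)

# Stand-in for a GAN generator: a directly optimised 224x224 RGB image
# (224x224 is the input resolution of the ViT-B/32 CLIP model).
image = torch.rand(1, 3, 224, 224, device=device, requires_grad=True)
optimizer = torch.optim.Adam([image], lr=0.05)

for step in range(200):
    optimizer.zero_grad()
    image_features = model.encode_image(image.clamp(0, 1))
    image_features = image_features / image_features.norm(dim=-1, keepdim=True)
    # Maximise cosine similarity between the image and the text prompt.
    loss = -(image_features * text_features).sum()
    loss.backward()
    optimizer.step()
```

In a fuller pipeline of this kind, the optimised variable would typically be the latent input of a pretrained generator rather than raw pixels, usually with image augmentations added for stability.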

Experiment AI-TNB is funded by UKRI’s Arts and Humanities Research Council programme ‘Towards a National Collection’ under grant AH/V015478/1.

Credits: Series curator Joasia Krysa; Series technical concept Leonardo Impett; Experiment machine learning concept and implementation Eva Cetinic; Web development and design MetaObjects (Ashley Lee Wong and Andrew Crowe) and Sui.

Enter project here

For more details about the project: How It Works

Find out about Experiment 1: B³(NSCAM), developed by Ubermorgen, Leonardo Impett and Joasia Krysa (2021) for Liverpool Biennial and The Whitney Museum of American Art

For a broader discussion of AI and curating, visit Liverpool Biennial’s journal Stages, vol. 9 (2021)

ai.biennial.com