Collaboration:

Veronica Samotskaya – ornithologist, biologist, science popularizer, journalist;

Natalia Soboleva – AI expert;

Konstantin Yakovlev – PhD in Physics and Mathematics, AI expert;

Nikita Prudnikov – musician, developer, AI expert.

The project was supported by Garage Museum of Contemporary Art (Moscow).

Bird Language is a project exploring the possibilities of artificial intelligence within the context of bio-semiotics.

The project is inspired by Noam Chomsky’s ideas about innate linguistic structures – a universal grammar – common to both human language and the sign systems of animals and birds. Findings in machine learning suggest that AI can capture such a universal grammar by statistically extracting the patterns of a language. In the case of bird language, machine learning can distinguish peculiar “bird phonemes”. This AI looks for patterns within bird sounds in order to build a mathematical model of the Universal Grammar of Bird Language.

In the first stage of the project, we trained a neural network on the sounds of nightingales to create communication between non-human agents: birds and AI. This is an exploration of communication between nature and technology in which a human being is not necessary.

The second stage of the project is the creation of an AI translator from bird language to human language. The bird we started working with was the Great Tit, one of the most widespread species across Europe and Asia. The first machine learning approach we used was XGBoost, a boosted decision tree algorithm. While this model classified bird signs into two groups, calls and songs, with an accuracy of 83%, it was not a generalizable tool for understanding the structure of bird language.
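The text does not specify the feature pipeline behind this first approach, so the following is only a minimal sketch of such a two-class (call vs. song) classifier, assuming mean MFCC features extracted with librosa, the standard XGBClassifier API, and a hypothetical labels.csv file listing recordings and their labels:

```python
# Minimal sketch: call-vs-song classification of Great Tit recordings with XGBoost.
# Assumptions (not specified in the project text): each clip is summarized by its
# mean MFCC vector, and labels come from a hypothetical labels.csv ("path,label").

import csv
import numpy as np
import librosa
from xgboost import XGBClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def clip_features(path, n_mfcc=20):
    """Load one recording and summarize it as the mean MFCC vector."""
    y, sr = librosa.load(path, sr=22050)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    return mfcc.mean(axis=1)

# Hypothetical label file: each row is an audio file path and "call" or "song".
paths, labels = [], []
with open("labels.csv") as f:
    for row in csv.DictReader(f):
        paths.append(row["path"])
        labels.append(row["label"])

X = np.stack([clip_features(p) for p in paths])
y = np.array([1 if lbl == "song" else 0 for lbl in labels])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1)
model.fit(X_train, y_train)

print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```

A classifier of this kind separates the two broad categories well, but, as noted above, it only labels signals; it does not expose the internal structure of the language.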

We are now developing a second machine learning approach, based on auto-encoders. First, we take the waveforms from recordings of bird language and process them with an auto-encoder in order to extract the shapes of bird phonemes. Once we have these shapes, we feed them into a second auto-encoder, which allows us to see clusters of bird phonemes. These clusters reveal a latent language structure within bird sounds, letting us deconstruct bird language into categories of phonemes. We use this to build an AI translator for interspecies communication.
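A minimal sketch of this two-stage auto-encoder idea is given below, assuming PyTorch, flattened spectrogram slices as input, and k-means clustering on the second latent space; the architecture sizes, the clustering step, and the random input data are illustrative assumptions, not the project’s actual configuration:

```python
# Two-stage auto-encoder sketch: raw slices -> "phoneme shapes" -> clustered latent space.

import torch
import torch.nn as nn
from sklearn.cluster import KMeans

class AutoEncoder(nn.Module):
    """A plain fully-connected auto-encoder: input -> latent -> reconstruction."""
    def __init__(self, in_dim, latent_dim):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(),
                                     nn.Linear(256, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(),
                                     nn.Linear(256, in_dim))
    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

def train(model, data, epochs=50, lr=1e-3):
    """Full-batch reconstruction training with an MSE loss."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        recon, _ = model(data)
        loss = loss_fn(recon, data)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return model

# Hypothetical input: 2000 spectrogram slices, each flattened to 1024 values.
segments = torch.rand(2000, 1024)

# Stage 1: compress raw slices into "phoneme shape" codes.
ae1 = train(AutoEncoder(in_dim=1024, latent_dim=64), segments)
with torch.no_grad():
    _, shapes = ae1(segments)

# Stage 2: compress the shapes again, then cluster the second latent space
# to reveal candidate categories of bird phonemes.
ae2 = train(AutoEncoder(in_dim=64, latent_dim=8), shapes)
with torch.no_grad():
    _, embedding = ae2(shapes)

clusters = KMeans(n_clusters=10, n_init=10).fit_predict(embedding.numpy())
print("cluster sizes:", [int((clusters == k).sum()) for k in range(10)])
```

In this setup the first latent space captures the shape of individual phonemes, while the second, lower-dimensional space is where groups of similar phonemes become visible as clusters.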

Through the auto-encoder, we created a universal tool not only to understand the structure of bird language, but also to translate and generate bird signs (a form of text-to-speech). This tool can serve as a base for many kinds of research, such as citizen-science studies of bird communication or of the influence of the urban environment on bird signals.
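Continuing the sketch above (and reusing its hypothetical ae1, ae2, embedding, and clusters), generating a bird sign would amount to decoding a point from the clustered latent space back into a spectrogram slice; turning that slice into audio would require an additional inverse-spectrogram step not shown here:

```python
import torch

# Pick one phoneme category, take its centroid in the second latent space,
# and decode it back through both auto-encoders. The result is a synthetic
# spectrogram slice -- the "text-to-speech" direction of the tool.
cluster_id = 3  # hypothetical choice of phoneme category
centroid = embedding.numpy()[clusters == cluster_id].mean(axis=0)

with torch.no_grad():
    shape = ae2.decoder(torch.tensor(centroid, dtype=torch.float32))
    spectrogram_slice = ae1.decoder(shape)

print(spectrogram_slice.shape)  # one generated 1024-value slice
```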

The motivation for the project was the paradigm of a non-anthropocentric world, in which we seek to understand non-human others such as animals and birds. The inspiration came from Jakob Johann von Uexküll’s concept of the Umwelt – the experiential world that a being inhabits, together with its perceptual surroundings.

Possessing non-deterministic, fuzzy logic, artificial intelligence as a non-human agent can help us understand other non-human agents such as birds and animals. In this case, AI is not only a mediator or interface between human beings and birds, but rather an organ or a full partner. Semiotically active, it helps us understand the subjectivity of birds through a language previously inaccessible to us.

Helena Nikonole is a new media artist, independent curator, and educator based in Istanbul, whose interests embrace hybrid art, bio-semiotics, and Artificial Intelligence. According to Nikonole, an artistic exploration of the possibilities (and potential risks) of technologies is necessary for understanding the context of the modern, technologically and media-determined world.

She has presented lectures and workshops on Art & Science and neural networks in art at various institutions, including the Rodchenko Art School (Moscow), Art Laboratory (Berlin), Mutek Festival (Montreal and Tokyo), ITMO University, and many others.

Exhibitions and festivals include “Out of the Box” at Ars Electronica 2019, AI Music Festival, “Persisting Realities” within the framework of CTM Festival (Kunstraum Kreuzberg, Berlin, Germany), “Open Codes” at ZKM Center for Art and Media (Karlsruhe, Germany), “Contagious Algorithms” at Drugo More (Rijeka, Croatia), IAM at Garage Museum (Moscow), and the “YouFab Creative Award 2019 Winners Exhibition” (SHIBUYA QWS, Tokyo, Japan), among others.

As a curator, she focuses on critical approaches to technology. Her projects include the exhibition “Learning Machines”, co-curated with artist and curator Alexey Shulgin (2019), the “Art Code” show (GROUND Solyanka Gallery, Moscow, 2021), “Datasets vs Mindsets: Post-Soviet Explorations of the Digital Control Society” (co-curated with Olga Vad, 2020), and the “Uncanny Dream” exhibition in the context of Ars Electronica Festival 2021 (together with Oxana Chvyakina).