Vision

What Will Life Become?

The search for extraterrestrial biosignatures, animal (and potentially human) cloning, human/machine mashups (cyborgs), and aspirations to enable reproduction beyond Earth are future-facing pursuits that have further complicated purported thresholds, conditions, and boundaries of “the human” and “life”—as if such categories have ever been stable. The What Will Life Become? Workshop explores how scientists, theorists, and artists can design and co-shape the futures of life, the mind, and the planet.

In concert with the Berggruen Institute’s newly launched Future Humans program, the Workshop’s participants will share their work and speculations on how anticipated directions of “the human” — a subject in an increasingly interconnected world of new natures and new technologies — will intervene in social, political, and philosophical realms.

We will address the overarching question, “What Will Life Become?” through various methods to stimulate interdisciplinary, and perhaps unexpected, insights. The Public Forum and three Panels (“Life Forms Beyond the Human,” “Futures of Life in Outer Space,” and “Artificial Intelligence and the Futures of the Mind”) put scholars (philosophers, historians, anthropologists) into conversation with experimentalists (cosmologists, biotechnologists, AI practitioners) to ask: how do theory and practice inform future directions of liveliness and being? The Keynote (USC) and Speculative Worldmaking session (BI) invite creative interventions through which we might imagine and practice novel and inclusive ways in which humans design and co-shape the futures of liveliness and cognition.

Questions we will ask and answer in the Workshop include:

How will scientists reform expectations of life and personhood in a post-biological world?

How will the extra-planetary recapitulate or stage new social relations, institutions, and politics of Earth?

How do novel human/non-human agents (animals, robots) disrupt andro- and anthropocentric hierarchies of species and mind?

How will concepts of indigeneity, race, and ethnicity be shuttled to worlds beyond Earth and times beyond our present?


composed by Arswain
machine learning consultation by Anna Tskhovrebov
commissioned by the Berggruen Institute
premiered at the Bradbury Building
downtown Los Angeles
April 22, 2022

Human perception of what sounds “beautiful” is necessarily biased and exclusive. If we are to truly expand our hearing apparatus, and thus our notion of beauty, we must not only shed preconceived sonic associations but also invite creative participation from beings non-human and non-living. We must also begin to cede creative control to such beings by encouraging them to exercise their own standards of beauty and collaborate with each other.

Movement I: Alarm Call
‘Alarm Call’ is a long-form composition and sound collage that juxtaposes, combines, and manipulates alarm calls from various human, non-human, and non-living beings. Evolutionary biologists understand the alarm call as an altruistic behavior among species: by warning others of danger, callers instinctively place themselves in a broader system of belonging. The piece poses the question: how might we hear better to broaden and enhance our sense of belonging in the universe? Might we behave more altruistically if we better heed the calls of – and call out to – non-human beings?

Using granular synthesis, biofeedback, and algorithmic modulation, I fold the human alarm call – the siren – into non-human alarm calls, generating novel “inter-being” sonic collaborations with increasing sophistication and complexity. 
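For readers unfamiliar with the technique, the core idea of granular synthesis can be sketched in a few lines: a source sound is sliced into short, windowed “grains” that are then overlap-added at new positions to form a cloud of sound. This is a minimal illustrative sketch, not the composition’s actual code; the signal, function name, and parameters are all invented for the example.

```python
import numpy as np

def granulate(signal, sr, grain_ms=50, density=200, duration_s=2.0, seed=0):
    """Resynthesize `signal` as a cloud of short, Hann-windowed grains.

    Grains are read from random positions in the source and overlap-added
    at random onsets in the output buffer -- the essence of granular synthesis.
    """
    rng = np.random.default_rng(seed)
    grain_len = int(sr * grain_ms / 1000)          # grain length in samples
    window = np.hanning(grain_len)                 # smooth grain envelope
    out = np.zeros(int(sr * duration_s) + grain_len)
    for _ in range(int(density * duration_s)):     # total number of grains
        src = rng.integers(0, len(signal) - grain_len)  # random read position
        dst = rng.integers(0, len(out) - grain_len)     # random write position
        out[dst:dst + grain_len] += signal[src:src + grain_len] * window
    return out[:int(sr * duration_s)]

# A stand-in "siren": a sine tone whose pitch sweeps slowly up and down.
sr = 8000
t = np.linspace(0, 2.0, 2 * sr, endpoint=False)
siren = np.sin(2 * np.pi * (600 + 200 * np.sin(2 * np.pi * 0.5 * t)) * t)

cloud = granulate(siren, sr)  # granular "cloud" derived from the siren
```

Varying grain length, density, and the randomness of read positions moves the result between recognizable echoes of the source and an abstract texture, which is how a siren can be folded into non-human calls.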

Movement II: A.I.-Truism
A synthesizer piece co-written with an AI in the style of Vangelis’s Blade Runner score, to pay homage to the space of the Bradbury Building.

Movement III: Alarmism
A machine learning model “learns” A.I.-Truism and recreates Alarm Call, generating an original fusion of the two.

Movement IV: A.I. Call
A machine learning model “learns” Alarm Call and recreates A.I.-Truism, generating an original fusion of the two.


RAVE: Realtime Audio Variational autoEncoder (IRCAM, 2021), https://github.com/acids-ircam/RAVE