Berggruen Institute Hosts What Will Life Become? Conference on Technology and the Human

Claire Webb

“We have the ability to self-modify through gene editing, AI, and other technological means. These are opportunities but also challenges that will change our nature. Where are we going as we expand and explore who we might become as a species?”

—Nicolas Berggruen

A self-aware artificial intelligence, algorithms that can be trained to “learn,” projects that plan for life beyond Earth, and biotechnologies that push the limits of life are among the recent and anticipated technoscientific developments that have conjured new futures and unsettled theories of body, mind, and species. They pose the question: What Will Life Become?

What Will Life Become?, a two-day workshop held this past April in collaboration with the USC Dornsife Center on Science, Technology, and Public Life, brought together path-breaking scholars, scientists, and artists to grapple with forms of life that might emerge by the mid-21st century. Together we asked: How can we knit together our present moment and an aspirational future by creating the conditions for the equitable coexistence of humans and our “Others” — a category that encompasses animals, thinking machines, and possible lively beings beyond the Earth?

The workshop opened with speculative architect Liam Young’s performance, Planet City and the Return of Global Wilderness, a science fiction safari set in a near future where humanity has radically reversed its current pattern of planetary sprawl. In this imaginary world, 10 billion people fête the rewilding of planet Earth.

In a Forum moderated by Claire Isabel Webb, 2021-2022 Berggruen Fellow, and in three panels — Futures of Life, Futures of Mind, and Futures in Outer Space — scientists and philosophers explored how technologies like CRISPR, machine learning, and even an artificial womb designed for interstellar journeys are reshaping expectations of life and personhood. Kenric Allado-McDowell composed a poem with GPT-3, a conversational AI model, reading, “Futures appear in bodies etched by stories encoding the will that learned through pain and invention. The future is seen by memory.” Lynn Rothschild, a NASA astrobiologist, sketched ways that scientists are using synthetic biology to imagine weird lifeforms beyond Earth. Meredith Whittaker of the AI Now Institute and Benjamin Bratton of UCSD debated the promises and perils of AI. These conversations provoked novel ways of imagining how ascendant non-human entities such as mycelial communities, robots, and neural nets disrupt hierarchies of species and mind — and how those entities might collaborate with humans to form novel social relations, institutions, and politics.

The Berggruen Institute commissioned three major new artworks for WWLB? and invited 100 guests to experience these installations at our headquarters, the iconic Bradbury Building. Sougwen Chung’s Ecologies of Becoming-With debuted a collaborative drawing performance with a multi-robotic system using spatialized sound and biofeedback. Nancy Baker Cahill’s towering AR figure, CORPUS, digitally inhabited the building’s entire atrium; the “symborg” evokes a future of blended, embodied entanglement between human, machine, and microbiome. REEPS100 showcased how he trains with AI to modify his voice. “The voice is the body and the mind,” he explained. “How I’ve trained my voice through collaboration with machines simultaneously pushes thresholds beyond what people thought possible and reveals what was already there.” REEPS100 revealed a turquoise-and-magenta-studded “voice gem” — a digital visualization of Nicolas Berggruen’s voice.

Embodied Futures, a breakout activity created by Senior Vice President of Programs Nils Gilman with help from the BI Fellows, invited participants to leverage the afternoon’s artistic provocations to envision possible worlds in 2049. Small groups debated how humans might use animal-language translation tools, the ethics of space settlements on Mars, and the concept of collective agency between species. In our final activity, an epistolary exercise addressed to a lively entity in a hoped-for future, we invited participants to populate those imagined worlds. In one Letter to 2049, a participant wrote to her children, “I hope you are still lively — that is, part human. I figure you will be ‘adjusted’ by technology, gene development, and other advances. AI might have become your companion. Despite all that, please read fiction, touch the grass, kiss the head of your sleeping baby, and treasure the bonds of family.” The Berggruen Institute plans to send these letters to their writers in 2049.

Stay tuned for future Berggruen Institute events around WWLB?, including a beautiful catalogue of the event, digital ways to participate in the futures activities, taped interviews with the three artists, and public programming around the themes of futures of life, mind, and outer space.


composed by Arswain
machine learning consultation by Anna Tskhovrebov
commissioned by the Berggruen Institute
premiered at the Bradbury Building
downtown Los Angeles
april 22, 2022

Human perception of what sounds “beautiful” is necessarily biased and exclusive. If we are to truly expand our hearing apparatus, and thus our notion of beauty, we must not only shed preconceived sonic associations but also invite creative participation from beings non-human and non-living. We must also begin to cede creative control away from ourselves and toward such beings by encouraging them to exercise their own standards of beauty and collaborate with each other.

Movement I: Alarm Call
‘Alarm Call’ is a long-form composition and sound collage that juxtaposes, combines, and manipulates alarm calls from various human, non-human, and non-living beings. Evolutionary biologists understand the alarm call to be an altruistic behavior among species that, by warning others of danger, instinctively place themselves in a broader system of belonging. The piece poses the question: how might we hear better to broaden and enhance our sense of belonging in the universe? Might we behave more altruistically if we better heeded the calls of – and called out to – non-human beings?

Using granular synthesis, biofeedback, and algorithmic modulation, I fold the human alarm call – the siren – into non-human alarm calls, generating novel “inter-being” sonic collaborations of increasing sophistication and complexity.

Movement II: A.I.-Truism
A synthesizer piece co-written with an AI in the style of Vangelis’s Blade Runner score, to pay homage to the space of the Bradbury Building.

Movement III: Alarmism
A machine learning model “learns” A.I.-Truism and recreates Alarm Call, generating an original fusion of the two.

Movement IV: A.I. Call
A machine learning model “learns” Alarm Call and recreates A.I.-Truism, generating an original fusion of the two.


RAVE (IRCAM 2021) https://github.com/acids-ircam/RAVE