Harpreet Sareen

Media Designer and Artist, 2022-2023 Berggruen Fellow

Biography

Harpreet Sareen is an Assistant Professor of Interaction and Media Design and Director of the Synthetic Ecosystems Lab at The New School. His research sits at the intersection of materials science, biology, and electronics and draws on the complementary abilities of the biological and artificial worlds. He terms this approach ‘Convergent Design’ and creates bionic materials and hybrid substrates that lend themselves to future ecological machinery, sensing systems, and interaction design. He has previously created robots driven by the physiological signals of plants, embedded nanosensors inside plant leaves for water monitoring, and grown electronic nanowires inside plant stems.

His experience spans corporate research labs, studios, museums, and academic centers; he has previously worked at the Ars Electronica Museum, Google Creative Lab, Microsoft Research, the National University of Singapore, Keio University, Telecom Paris, and The University of Tokyo. He has been named an MIT Technology Review Innovator Under 35 and has received the CHI Golden Mouse, a Gold Edison Award, the SXSW Interactive Innovation Award, and a Fast Company World Changing Ideas award. At Berggruen, Harpreet will create installations that refocus human attention on underwater macroalgae by recording their physiological phenomena along coasts around the world.


composed by Arswain
machine learning consultation by Anna Tskhovrebov
commissioned by the Berggruen Institute
premiered at the Bradbury Building
downtown Los Angeles
April 22, 2022

Human perception of what sounds “beautiful” is necessarily biased and exclusive. If we are to truly expand our hearing apparatus, and thus our notion of beauty, we must not only shed preconceived sonic associations but also invite creative participation from beings non-human and non-living. We must also begin to cede creative control to such beings, encouraging them to exercise their own standards of beauty and to collaborate with each other.

Movement I: Alarm Call
‘Alarm Call’ is a long-form composition and sound collage that juxtaposes, combines, and manipulates alarm calls from various human, non-human, and non-living beings. Evolutionary biologists understand the alarm call as an altruistic behavior between species: by warning others of danger, callers instinctively place themselves within a broader system of belonging. The piece poses the question: how might we hear better to broaden and enhance our sense of belonging in the universe? Might we behave more altruistically if we better heed the calls of – and call out to – non-human beings?

Using granular synthesis, biofeedback, and algorithmic modulation, I fold the human alarm call – the siren – into non-human alarm calls, generating novel “inter-being” sonic collaborations with increasing sophistication and complexity. 
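As an illustration only (not the composer's actual process), the sketch below shows the basic granular-synthesis gesture described above: a siren recording is chopped into short, windowed grains that are scattered over an animal alarm-call recording. The file names, grain length, and mix levels are placeholder assumptions.

```python
# Illustrative granular-synthesis sketch; "siren.wav" and "animal_alarm.wav" are placeholders.
import numpy as np
import librosa
import soundfile as sf

SR = 44100
siren, _ = librosa.load("siren.wav", sr=SR, mono=True)
animal, _ = librosa.load("animal_alarm.wav", sr=SR, mono=True)

grain_len = int(0.05 * SR)          # 50 ms grains
window = np.hanning(grain_len)      # fade each grain in and out to avoid clicks
rng = np.random.default_rng(0)

mix = animal.copy()
for _ in range(400):
    # pick a random grain from the siren and a random position in the mix
    start = rng.integers(0, len(siren) - grain_len)
    pos = rng.integers(0, len(mix) - grain_len)
    mix[pos:pos + grain_len] += 0.3 * window * siren[start:start + grain_len]

sf.write("inter_being.wav", mix / np.max(np.abs(mix)), SR)
```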

Movement II: A.I.-Truism
A synthesizer piece co-written with an AI in the style of Vangelis’s Blade Runner score, to pay homage to the space of the Bradbury Building.

Movement III: Alarmism
A machine learning model “learns” A.I.-Truism and recreates Alarm Call, generating an original fusion of the two.

Movement IV: A.I. Call
A machine learning model “learns” Alarm Call and recreates A.I.-Truism, generating an original fusion of the two.


RAVE (IRCAM, 2021): https://github.com/acids-ircam/RAVE
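The following is a minimal sketch of how the “learning and recreating” in Movements III and IV might be reproduced with the RAVE tool credited above, assuming a model already trained on one movement and exported to TorchScript; the file names and sample rate are placeholder assumptions, not details from the piece.

```python
# Hypothetical example: passing one movement through a RAVE model trained on another.
# "alarm_call.ts" (exported model) and "ai_truism.wav" (input audio) are placeholder names.
import torch
import librosa
import soundfile as sf

SR = 44100  # assumed sample rate; must match the rate the model was trained at

model = torch.jit.load("alarm_call.ts").eval()              # exported RAVE checkpoint
audio, _ = librosa.load("ai_truism.wav", sr=SR, mono=True)

x = torch.from_numpy(audio).reshape(1, 1, -1)               # RAVE expects (batch, channel, time)

with torch.no_grad():
    z = model.encode(x)   # project the input into the model's learned latent space
    y = model.decode(z)   # resynthesize it through the material the model was trained on

sf.write("alarmism_sketch.wav", y.reshape(-1).numpy(), SR)
```

Swapping which movement the model was trained on and which recording is fed through it would yield the mirrored pairing described in Movement IV.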