composed by Arswain
machine learning consultation by Anna Tskhovrebov
commissioned by the Berggruen Institute
premiered at the Bradbury Building
downtown Los Angeles
April 22, 2022

Human perception of what sounds “beautiful” is necessarily biased and exclusive. If we are to truly expand our hearing apparatus, and thus our notion of beauty, we must not only shed preconceived sonic associations but also invite creative participation from beings non-human and non-living. We must also begin to cede creative control away from ourselves and toward such beings by encouraging them to exercise their own standards of beauty and collaborate with each other.

Movement I: Alarm Call
‘Alarm Call’ is a long-form composition and sound collage that juxtaposes, combines, and manipulates alarm calls from various human, non-human, and non-living beings. Evolutionary biologists understand the alarm call as an altruistic behavior within and across species: by warning others of danger, a caller instinctively places itself in a broader system of belonging. The piece poses the question: how might we hear better, so as to broaden and enhance our sense of belonging in the universe? Might we behave more altruistically if we better heeded the calls of – and called out to – non-human beings?

Using granular synthesis, biofeedback, and algorithmic modulation, I fold the human alarm call – the siren – into non-human alarm calls, generating novel “inter-being” sonic collaborations with increasing sophistication and complexity. 
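Granular synthesis, as used here, chops a source sound into tiny windowed "grains" that are pitch-shifted and overlapped back into a new texture. A minimal NumPy sketch of the idea follows; every name and parameter (the granulate function, the jitter amount, the synthetic siren) is illustrative, not taken from the piece's actual patch:

```python
import numpy as np

SR = 44_100  # sample rate (Hz)

def granulate(source, grain_len=2048, hop=512, pitch_jitter=0.2, seed=0):
    """Chop `source` into short Hann-windowed grains, pitch-shift each by a
    random resampling factor, and overlap-add them into a 'grain cloud'."""
    rng = np.random.default_rng(seed)
    window = np.hanning(grain_len)
    out = np.zeros(len(source) + grain_len)
    for start in range(0, len(source) - grain_len, hop):
        grain = source[start:start + grain_len] * window
        # resample the grain to shift its pitch slightly up or down
        factor = 1.0 + rng.uniform(-pitch_jitter, pitch_jitter)
        idx = np.arange(0, grain_len, factor)[:grain_len]
        shifted = np.interp(idx, np.arange(grain_len), grain)
        out[start:start + len(shifted)] += shifted
    return out[:len(source)]

# a crude stand-in for a siren: a sine whose pitch sweeps up and down
t = np.arange(SR * 2) / SR
siren = np.sin(2 * np.pi * 600 * t + 300 * np.sin(2 * np.pi * 0.5 * t))
cloud = granulate(siren)
```

Written to a WAV file at SR Hz, `cloud` is a shimmering, detuned version of the swept tone; applying the same process to recorded animal calls produces textures in the spirit of the "inter-being" collaborations described above.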

Movement II: A.I.-Truism
A synthesizer piece co-written with an AI in the style of Vangelis’s Blade Runner score, to pay homage to the space of the Bradbury Building.

Movement III: Alarmism
A machine learning model “learns” A.I.-Truism and recreates Alarm Call, generating an original fusion of the two.

Movement IV: A.I. Call
A machine learning model “learns” Alarm Call and recreates A.I.-Truism, generating an original fusion of the two.
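The model behind Movements III and IV is presumably a neural audio system such as RAVE (linked at the end of the program), which learns one piece and resynthesizes the other through its latent space. A model-free toy analogue of this kind of fusion is classic cross-synthesis, which imposes the spectral magnitudes of one signal on the phases of another. The sketch below is NumPy only, and the two test signals are crude stand-ins, not the actual recordings:

```python
import numpy as np

SR = 44_100  # sample rate (Hz)

def cross_synthesize(carrier, modulator, frame=1024, hop=256):
    """Impose the spectral magnitude of `modulator` onto the phase of
    `carrier`, frame by frame, with windowed overlap-add resynthesis."""
    win = np.hanning(frame)
    n = min(len(carrier), len(modulator))
    out = np.zeros(n)
    norm = np.zeros(n)
    for start in range(0, n - frame, hop):
        C = np.fft.rfft(carrier[start:start + frame] * win)
        M = np.fft.rfft(modulator[start:start + frame] * win)
        hybrid = np.abs(M) * np.exp(1j * np.angle(C))  # M's timbre, C's phase
        out[start:start + frame] += np.fft.irfft(hybrid, frame) * win
        norm[start:start + frame] += win ** 2
    return out / np.maximum(norm, 1e-8)  # undo the window gain

# crude stand-ins for the two movements
t = np.arange(SR) / SR
alarm = np.sin(2 * np.pi * 440 * t)            # pure tone
truism = np.sign(np.sin(2 * np.pi * 110 * t))  # buzzy square wave
fusion = cross_synthesize(alarm, truism)
```

Unlike a learned model, this fusion is fixed and symmetric-by-construction; swapping the two arguments is the rough equivalent of swapping which piece "learns" and which is recreated, as Movements III and IV do.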


RAVE (IRCAM, 2021): https://github.com/acids-ircam/RAVE