Berggruen Seminar 23: Psychological, Brain and Virtual Reality Research on Moral Emotion and Cognition

September 20, 2023

Time:
3:30 – 5:00 am PST | 6:30 – 8:00 pm BST

Venue:
PKU Bookstore
(B2, New Sun Student Center)
Open only to PKU students and faculty

Language: Chinese

Live stream available

About the event:
Moral issues and related social phenomena have long attracted public attention, and morality has been an active cross-disciplinary research area in philosophy, sociology, psychology, and other disciplines. Yet we still lack a good answer to the key scientific question: how do people process morality? Combining behavioral experiments, brain imaging, and virtual reality technology, we have conducted a series of studies on the psychological and brain mechanisms of moral emotion and cognition, examining how different moral emotions are processed, how they can be influenced and intervened upon, and how gender differences shape courageous action. We hope that a deeper understanding of moral emotion and cognition will help us better understand interpersonal interaction and social moral relationships. The talk will address questions such as:

  • What are the corresponding neural mechanisms of moral behaviors such as interpersonal deception and cooperation?
  • How are moral emotions such as guilt, shame, and gratitude represented in the brain?
  • How can virtual reality technology be used to study human social and ethical behavior?

Speaker:
LIU Chao
Professor at the Faculty of Psychology, State Key Laboratory of Cognitive Neuroscience and Learning & IDG / McGovern Institute for Brain Research, Beijing Normal University
2022-2023 Berggruen Fellow

Professor Liu studies emotion and social cognition in neuroscience. He focuses on the regulatory role and brain mechanisms of emotion in social cognition, especially moral cognition, as well as applications in fields such as education, management, and public security. He has published more than 40 papers as corresponding author in well-known international journals such as Cerebral Cortex, NeuroImage, and Social Cognitive & Affective Neuroscience. He has served as PI of two major National Social Science Fund projects: “Interdisciplinary Research on the Psychology, Brain, and Artificial Intelligence of Chinese People’s Moral Cognition and Emotional Characteristics” and “Characteristics of Chinese People’s Social Cognition: Integrated Research in Psychological and Brain Sciences”.

Moderator:
KUAI Shuguang

Professor at the School of Psychology and Cognitive Science, East China Normal University, Vice Director of the Engineering Psychology Professional Committee, Chinese Psychological Society

Professor Kuai’s research uses virtual reality, neuroimaging, and computational modeling to explore human social interaction and human-computer interaction, with a particular focus on human behavior in virtual environments. He has published numerous papers as first or corresponding author in internationally renowned scientific journals, including Nature Neuroscience, Nature Human Behaviour, and Nature Machine Intelligence. Professor Kuai has served as Principal Investigator (PI) on multiple grants awarded by the National Natural Science Foundation of China, including the National Science Fund for Excellent Young Scholars.


composed by Arswain
machine learning consultation by Anna Tskhovrebov
commissioned by the Berggruen Institute
premiered at the Bradbury Building
downtown Los Angeles
april 22, 2022

Human perception of what sounds “beautiful” is necessarily biased and exclusive. If we are to truly expand our hearing apparatus, and thus our notion of beauty, we must not only shed preconceived sonic associations but also invite creative participation from beings non-human and non-living. We must also begin to cede creative control away from ourselves and toward such beings by encouraging them to exercise their own standards of beauty and collaborate with each other.

Movement I: Alarm Call
‘Alarm Call’ is a long-form composition and sound collage that juxtaposes, combines, and manipulates alarm calls from various human, non-human, and non-living beings. Evolutionary biologists understand the alarm call as an altruistic behavior among species: by warning others of danger, callers instinctively place themselves within a broader system of belonging. The piece poses the question: how might we hear better, so as to broaden and enhance our sense of belonging in the universe? Might we behave more altruistically if we better heeded the calls of – and called out to – non-human beings?

Using granular synthesis, biofeedback, and algorithmic modulation, I fold the human alarm call – the siren – into non-human alarm calls, generating novel “inter-being” sonic collaborations with increasing sophistication and complexity.
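For readers unfamiliar with the technique, granular synthesis chops a source sound into short windowed “grains” and reassembles them into a new texture. The sketch below is a minimal, hypothetical illustration in NumPy – it is not the composer’s actual patch, and the signal used (a plain sine tone standing in for a siren) is an assumption for the example.

```python
# Minimal granular-synthesis sketch (illustrative only, not the actual patch):
# short Hann-windowed grains are extracted from a source signal at random
# positions and overlap-added at new random positions, producing a granular
# texture of the source sound.
import numpy as np

def granulate(source, grain_len=512, n_grains=200, out_len=48000, seed=0):
    """Scatter Hann-windowed grains of `source` across an output buffer."""
    rng = np.random.default_rng(seed)
    window = np.hanning(grain_len)
    out = np.zeros(out_len)
    for _ in range(n_grains):
        src_pos = rng.integers(0, len(source) - grain_len)  # where to read a grain
        dst_pos = rng.integers(0, out_len - grain_len)      # where to place it
        out[dst_pos:dst_pos + grain_len] += source[src_pos:src_pos + grain_len] * window
    return out / max(np.max(np.abs(out)), 1e-9)  # normalize peak to 1.0

# Example: granulate one second of a 440 Hz tone (a stand-in "siren") at 48 kHz.
sr = 48000
t = np.arange(sr) / sr
siren = np.sin(2 * np.pi * 440 * t)
texture = granulate(siren)
```

In a real patch the grain positions, lengths, and densities would be driven by the biofeedback and algorithmic-modulation layers the text describes, rather than by a fixed random seed.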

Movement II: A.I.-Truism
A synthesizer piece co-written with an AI in the style of Vangelis’s Blade Runner score, to pay homage to the space of the Bradbury Building.

Movement III: Alarmism
A machine learning model “learns” A.I.-Truism and recreates Alarm Call, generating an original fusion of the two.

Movement IV: A.I. Call
A machine learning model “learns” Alarm Call and recreates A.I.-Truism, generating an original fusion of the two.


RAVE (IRCAM, 2021): https://github.com/acids-ircam/RAVE