Is Technology a ‘Smart’ Drug?

Sponsored by the Berggruen Research Center, Peking University, and the OWSpace Foundation, two events were held at Beijing’s OWSpace bookstore on September 4: the lecture “Is Technology a ‘Smart’ Drug?” and a sharing session for the book Technology Is Sick, But I Have No Cure《技术有病,我没药》. About 60 people attended in person, while as many as 15,000 viewers watched the livestream on Bilibili, a Chinese video-sharing platform.

The lecture focused on two primary questions: In our technological age, how should people live their lives, and how should technology developers conceive of and plan for the future? The time when people could choose whether to enjoy the fruits of technology or abandon the conveniences it brings may have already passed. From mobile phones to the Five-Flavored Tea of Forgetfulness, from lotus fruits to sex robots, from rejection of technology to human enhancement, from Martin Heidegger to Bernard Stiegler, Technology Is Sick, But I Have No Cure takes on the most challenging questions of our age. Authored by four philosophy of technology researchers — Yang Qingfeng, Yan Hongxiu, Duan Weiwen, and Liu Yongmou — the book sheds light on the debate over whether technology is a poison or a cure for society.

The event was chaired by Li Xiaojiao, Associate Director of the Berggruen China Center. Two of the book’s authors — Professor Liu Yongmou, a doctoral supervisor at the School of Philosophy, Renmin University of China, and Duan Weiwen, a 2020-2021 Berggruen Fellow and research fellow at the Institute of Philosophy, Chinese Academy of Social Sciences (CASS) — delivered keynote remarks, held a discussion guided by eight key practical questions posed in the book, and answered audience questions.

1. Technology Is Rebellious: Can Humans Guide It Toward Goodness?

Liu Yongmou said at the sharing session that the era we are living in is an age of technology rather than an age of science. In ancient times, science and technology were separate fields. Science referred to intellectual traditions shared by aristocrats, while technology was merely considered “diabolic tricks and wicked crafts.” Back then, the latter couldn’t appeal to refined tastes. It wasn’t until the second half of the 19th century that science and technology were really integrated since each required the other to enhance explanatory powers and humanity’s ability to affect reality.

Since then, the status of technology has continued to improve and has even started to challenge the dominant position of science. Practical social applications of technology enabled it to successfully resist the chauvinism of science and, ultimately, to invert the relationship between science and technology, fostering the birth of “technoscience.” A “rebellion of oppressed knowledge” has occurred. Humankind has since officially entered a new technological world. Now what?

Technology’s rationality lays the foundation for its continued relevance. Technology has gradually penetrated both the human body and spirit. However, excessive obsession with technology puts humans in a state of exhaustion and unconscious passivity; technology has even begun to weigh in on the “physical and mental design” of mankind. We can do nothing but act like Lucy — the early hominin who came down from the trees and no longer lived like the apes — and bid farewell to the concept of “technological supremacy.” Unfortunately, we have no idea how to appropriately coexist with technology.

Liu Yongmou also addressed the question of whether technology is controllable. He pointed out that the answer should not be sought from technology itself but lies in humankind’s own choices: First, do humans have the determination and courage to control the development of technology? Second, and more importantly, what price or sacrifice are humans willing to pay to control technology? In a technologically governed society, it is easy to slide toward depravity but difficult to seek goodness. Each of us can be part of the cure, but no one else can prescribe and take the medicine for you.

2. Schizophrenia in a Deeply Technologized Era

Duan Weiwen believes that overdependence on technology is turning our world into a compliance-based operating system in which obedience and docility become encouraged values. Technology will be part of most decision-making processes and will exert decisive influence over their final outcomes.

In such a world, both humankind and nature face the risk of being “liquidized” by technology: data dissolves individuals into information flows, and human behaviors and preferences become objects of computation. Nature becomes a floating commodity in the global trading system. Carbon accounting and carbon trading have made the eco-environment computable and tradable — without regard for the fact that nature cannot simply be added and subtracted like numbers (earning the right to increase emissions in one place by continuously planting trees in another can only lead to ecological degradation). The world also faces the risk of being “vaporized”: with the advance of virtual reality technology and the metaverse, the real world we live in is gradually being replaced by a computable world, leaving us trapped in sensory stimuli, a digital nightmare.

We also face the risk of being “domesticated” as we make technological advances. We hesitated at the Sputnik moment, the Hiroshima moment, and the Dolly the Sheep moment. As we use technology to upgrade ourselves and profoundly transform the earth, we are also overshadowed by problems such as food safety, ecological crises, and technology addiction; we lose basic skills in the wake of automation and become “technological refugees.”

When platforms and capital partially monopolize technology, consequences may arrive sooner than we think. Using ubiquitous monitoring technology, capitalism will use the “one-way glass” of data to track every penny you spend, every trip you take, and every micro-expression you show. This new form of capitalism sends promotional material to targeted customers, ultimately forming a dystopia in which people’s private lives are under panoramic, real-time monitoring. Boundaries may vanish. Data will become everyone’s “new skin,” and any substandard or forbidden words and behaviors will be recorded as indelible “scars” on our data skins.

Facing such a possible future, we must strengthen critical thinking not only in words but in practice: we should ask whether we really need certain technologies, and when we have the right to say no and shut them down.

3. Status of the Philosophy of Science and Technology; Academic Integrity and Ethical Education in the Technological Age

After the sharing session, Liu Yongmou and Duan Weiwen answered several questions raised by online and offline audience members. Some of the questions were about the value and influence of the philosophy of science and technology, while others centered on exploring interactive relations between the philosophy of science and technology and society.

Two questions came up repeatedly. First, given that the golden age of philosophy has passed, what is the practical orientation of the philosophy of science and technology as an academic discipline in the face of strong capital and power, as well as fanatical techno-optimists and industrialists? Second, should philosophers of science and technology participate in social discussions, and if so, how?

Liu Yongmou holds that technology has become a rhythm that cannot be neglected in this era, and understanding technology is a necessary prerequisite for understanding the spirit of the era. In this context, philosophy plays a role in clarifying relevant concepts and logic, setting the agenda, raising objections against widespread optimism, and helping societal actors figure out strategies to cope with possible challenges when the future remains uncertain. However, philosophy is not an elixir; just like other disciplines, it sails along its own path and is neither superior nor inferior to any others. We need philosophy to help us navigate between a utopia and a dystopia and between an ideal state and a machine-dominated state.

Duan Weiwen recounted his extensive discussions on the philosophy of science and technology with people from industrial, educational, and academic circles. He holds that in an age of information explosion and knowledge overflow, the key role that philosophy can play is no longer metaphysics but connection. It integrates, sorts, analyzes, and criticizes knowledge produced by all disciplines and sectors, then constructs logic and narratives that render possible future scenarios imaginable and traceable. The contemporary value of the philosophy of science and technology lies in the fact that it helps humankind think about the future and participate in shaping a possible, friendlier future in which nothing has yet been determined. Philosophy enables more people to gain independent awareness and rethink the future.

One participant asked how to guarantee academic integrity in the age of technology and how technological advances are likely to affect the scope of academic standards and misconduct.

Both keynote speakers believe that academic capitalism profoundly impacts academic outcomes. When we evaluate professors’ success based entirely on the number of papers and journal articles they publish, “academic” evaluation veers off course. The reproducibility of experimental results is a matter of truth and should never be manipulated. Fraud violates academic ethics and may stem from insufficient education in research ethics and integrity among engineers and scientific researchers. Duan Weiwen warned that we should stay vigilant against overdependence on duplication checking software, because plagiarism and fraud go beyond copying sentences and phrases: it is possible to duplicate viewpoints without copying words verbatim. Duplication checking software may be powerless to identify copying that has been repeatedly translated and rephrased. What we should truly be concerned about is the theft of others’ ideas.
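As a purely hypothetical illustration (this code is not from the book or the speakers), the shingle-overlap matching at the heart of many duplication checkers can be sketched in a few lines. It scores a verbatim copy perfectly but is blind to a paraphrase of the same idea, which is exactly the limitation Duan Weiwen describes:

```python
# Sketch of shingle-based overlap scoring, the core of many duplication
# checkers. All text samples below are invented for illustration.

def shingles(text, n=3):
    """Return the set of n-word sequences (shingles) in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    """Jaccard similarity between the shingle sets of two texts."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

original   = "excessive obsession with technology puts humans in a state of exhaustion"
verbatim   = "excessive obsession with technology puts humans in a state of exhaustion"
paraphrase = "being overly fixated on technical devices leaves people drained and passive"

print(jaccard(original, verbatim))    # 1.0 — flagged as duplication
print(jaccard(original, paraphrase))  # 0.0 — same idea, invisible to the checker
```

The paraphrase shares no three-word sequence with the original, so the score collapses to zero even though the viewpoint is identical: surface matching cannot detect the theft of ideas.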

For the question of how to draw lessons from past experiences in technological governance, Liu Yongmou holds that the concept of “scientific supremacy” began to prevail as early as the 18th and 19th centuries, and, accordingly, there were radical applications of technology (such as excising the brain lobes of criminals). The difference between the past and the present is that in the past, technological governance was based on disciplines like chemistry, physics, and psychology, whereas now intelligent technologies are guiding governance models. Intelligent governance built on massive amounts of data also increases the difficulty of control and restraint, a challenge that requires us to innovate our thinking and give it sufficient attention.

For the question of whether technical colleges should introduce philosophical and humanistic scholars as key leaders, Duan Weiwen mentioned the courses on academic standards, scientific honesty, and technological ethics that he teaches at Southern University of Science and Technology. He believes that an ethical education for scholars with technical backgrounds will be a priority in the future. Only when those who control technology reflect on technology can human society form a better collective imagination of it and ensure the light of philosophy shines on our deeply technologized future.

4. Responses to The Book Title

The title of the book, Technology Is Sick, But I Have No Cure, has been widely criticized as “a bit of a stretch.” The two scholars admitted that a “fancy and eye-catching title” is necessary at a time when human attention is scarce and information is overabundant. Only when readers are drawn in by a book’s title can they learn something by reading it.

This also mirrors the teaching style of the two scholars — stepping out of the academic “ivory tower” to cool down the age of technology with their philosophical thinking in a reasonable way. Technology is both poison and cure, and perhaps each of us can offer a cure. No one else can take medicine for you.

Script by Li Zhilin, undergraduate student at the University of Hong Kong

composed by Arswain
machine learning consultation by Anna Tskhovrebov
commissioned by the Berggruen Institute
premiered at the Bradbury Building
downtown Los Angeles
april 22, 2022

Human perception of what sounds “beautiful” is necessarily biased and exclusive. If we are to truly expand our hearing apparatus, and thus our notion of beauty, we must not only shed preconceived sonic associations but also invite creative participation from beings non-human and non-living. We must also begin to cede creative control away from ourselves and toward such beings by encouraging them to exercise their own standards of beauty and collaborate with each other.

Movement I: Alarm Call
‘Alarm Call’ is a long-form composition and sound collage that juxtaposes, combines, and manipulates alarm calls from various human, non-human, and non-living beings. Evolutionary biologists understand the alarm call to be an altruistic behavior: by warning others of danger, callers instinctively place themselves in a broader system of belonging. The piece poses the question: how might we hear better to broaden and enhance our sense of belonging in the universe? Might we behave more altruistically if we better heed the calls of – and call out to – non-human beings?

Using granular synthesis, biofeedback, and algorithmic modulation, I fold the human alarm call – the siren – into non-human alarm calls, generating novel “inter-being” sonic collaborations with increasing sophistication and complexity. 
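As a loose sketch of the granular-synthesis step described above (the function, parameters, and signals are all illustrative assumptions, not the composer’s actual patch), short windowed grains from a “siren” signal and a non-human call can be scattered into a single shared texture:

```python
import numpy as np

def granular_blend(siren, call, sr=44100, grain_ms=80, n_grains=400, seed=0):
    """Scatter short Hann-windowed grains drawn from two source signals
    into one output buffer, producing an interleaved 'inter-being' texture."""
    rng = np.random.default_rng(seed)
    grain_len = int(sr * grain_ms / 1000)
    window = np.hanning(grain_len)          # smooth each grain's edges
    out = np.zeros(max(len(siren), len(call)) + grain_len)
    for _ in range(n_grains):
        src = siren if rng.random() < 0.5 else call   # pick a source at random
        start = rng.integers(0, len(src) - grain_len)  # random read position
        grain = src[start:start + grain_len] * window
        pos = rng.integers(0, len(out) - grain_len)    # random write position
        out[pos:pos + grain_len] += grain              # overlap-add the grain
    peak = np.max(np.abs(out))
    return out / peak if peak > 0 else out             # normalize to [-1, 1]
```

Raising `n_grains` or shortening `grain_ms` thickens the texture; modulating the random read positions over time is one simple way to morph gradually from one source into the other.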

Movement II: A.I.-Truism
A synthesizer piece co-written with an AI in the style of Vangelis’s Blade Runner score, to pay homage to the space of the Bradbury Building.

Movement III: Alarmism
A machine learning model “learns” A.I.-Truism and recreates Alarm Call, generating an original fusion of the two.

Movement IV: A.I. Call
A machine learning model “learns” Alarm Call and recreates A.I.-Truism, generating an original fusion of the two.