How to Treat the Data Economy Like a Utility

“Data utilities” should be established across the U.S., creating universal access to public data.

Yakov Feygin

Yakov Feygin is the associate director of the Future of Capitalism program at the Berggruen Institute.

The rise of the “data economy” rightly makes people nervous. From political propaganda to scandals involving personal data, companies like Google, Apple, Facebook and Amazon have not exactly covered themselves in glory. Politicians and others have grown increasingly concerned that the data economy is becoming monopolistic and exploiting workers, and several Democratic presidential candidates have proposed using the tools of monopoly regulation to break up the Silicon Valley giants. These efforts are commendable but, ironically, may not be ambitious enough.

Regulating the digital economy will mean not only assuring competition, but also actively incentivizing ways it can serve the common good. New data-intensive firms have tremendous potential to transform the economy. But so far, their effect has been negligible: American productivity growth is slowing, and whatever benefits productivity brings have not flowed to workers. To fully realize the potential benefits that this industry may bring, we need to treat it like what it most resembles — a utility. The utility approach to regulating data firms opens a vista of possibilities to create a new economic ecosystem that serves the public good while assuring economic growth.

At first glance, the challenges of regulating data-intensive industries might seem unprecedented and daunting. Data, unlike many other commodities, has huge network effects and returns to scale. Our individual information is valuable, but not as valuable as the aggregated information of many users. “Big data” allows developers to sharpen predictive and analytic tools and could eventually be the fuel upon which giant leaps in artificial intelligence are built.

There are harms that come with this segmented platform ecosystem. First, developers are reliant on one platform or another for inputs, restricting the interoperability of systems. As a result, large platform companies can either acquire or compete with their customers. This is common enough that developers have termed it “the kill zone.” Indeed, despite tales of Silicon Valley entrepreneurship, new business creation is down, and many technology startups are sold to major firms.

While these arrangements are good for some individual entrepreneurs, who are well compensated for selling their firms, they are bad for competition and the larger economy. This might be the answer to the infamous “productivity puzzle” — the question of why, despite massive technological advances, productivity growth has slowed. Instead of diffusing productive advances across industries, platforms with large network effects force other firms to rely on their standards and environments to deploy technology.

We are faced with a dilemma. On one hand, big data requires scale and uniformity that fit a large network better than many small silos. On the other, this same effect is responsible for an inequitable, politically opaque and ultimately inefficient structure for a vital new industry. The good news is that we’ve been here before, and we have the tools to solve these issues. The creation of utilities — publicly authorized and specially regulated monopolies — is a cornerstone of American anti-trust law, seen by the original progressives as a powerful tool with which to combat rent-seeking but “natural” monopolies.

The progressive era also offers us an example of a general-purpose technology that mirrors data and AI: electricity. Despite the development of the steam turbine in the late 19th century, it took decades for the impact of the technology to be felt. Firms either had to have the capital to install their own turbines in their factories, or they had to rely on transmission lines being established by unregulated monopolies. Starting at the municipal and state levels, political leaders and constituencies established public utilities for electrical transmission.

We also have a model for how the American government successfully shaped an emerging industry by creating and directing an innovative ecosystem. During the Cold War, the Defense Advanced Research Projects Agency established missions for private industries to collaborate on. Using its role as funder and customer, DARPA coordinated these collaborations and made sure that firms participating in these operations relied on each other to form innovative ecosystems in which knowledge was shared rather than hoarded. In fact, DARPA contracts explicitly forced competing contractors to share proprietary information with one another.

The experience of utility regulation and DARPA provides us with a blueprint for tackling monopoly in the new economy in a way that benefits multiple stakeholders rather than entrenched interests. Public data banks — “data utilities” — should be established by localities, states and the federal government. The goals of these data banks should be to establish a standard for public data, to integrate private data with public data so that all firms have universal access, and to create an ecosystem in which private companies depend on the public data bank and each other to innovate. Such agencies should aggressively solicit the transfer of private data to public uses through contracts and tax incentives.

Fortunately, discussions are underway in a wide variety of municipalities that may one day evolve toward such a data utility. The city of Toronto and a Google subsidiary called Sidewalk Labs are collaborating to solve the problem of data access and governance via the creation of an “independent civic data trust,” which would control the data. There are many ethical and governance questions that arise from this arrangement, but the point is that there is a legal architecture in place to solve them.

However, a trust alone is not enough to steer the new economy in the way a utility could. In California, following a call by Governor Gavin Newsom for a “data dividend,” some theorists have begun to think of a “dividend” in the broad context of a public good. One strategy would be to set up a data tax to fund a “Data Relations Board” that would build on existing legislation to assemble a public data set. The DRB could use tax breaks and other incentives to encourage firms to transfer proprietary data sets into its custody for wide use. By establishing a public infrastructure, this new agency would be able to guide the development of the data economy toward a larger public good.

Public data utilities are a vital policy for reshaping the economy to serve the public good, as well as for restarting stagnant economic growth. Since the collapse of the dot-com bubble, the rate of productivity growth — the measure of economic and technical progress — has stalled in most industrialized economies, even as new technology has developed at an exponential pace. Public data utilities could help clear the bottlenecks that have limited the wide-scale deployment of these technologies and help construct an economy in which members of the public have a chance to benefit from the fruits of increased productivity.

