The epistemic coup proceeds in four stages.
The first is the appropriation of epistemic rights, which lays the foundation for all that follows. Surveillance capitalism originates in the discovery that companies can stake a claim to people’s lives as free raw material for the extraction of behavioral data, which they then declare their private property.
The second stage is marked by a sharp rise in epistemic inequality, defined as the difference between what I can know and what can be known about me. The third stage, which we are living through now, introduces epistemic chaos caused by the profit-driven algorithmic amplification, dissemination and microtargeting of corrupt information, much of it produced by coordinated schemes of disinformation. Its effects are felt in the real world, where they splinter shared reality, poison social discourse, paralyze democratic politics and sometimes instigate violence and death.
In the fourth stage, epistemic dominance is institutionalized, overriding democratic governance with computational governance by private surveillance capital. The machines know, and the systems decide, directed and sustained by the illegitimate authority and anti-democratic power of private surveillance capital. Each stage builds on the last. Epistemic chaos prepares the ground for epistemic dominance by weakening democratic society — all too plain in the insurrection at the U.S. Capitol.
To understand the economics of epistemic chaos, it’s important to know that surveillance capitalism’s operations have no formal interest in facts. All data is welcomed as equivalent, though not all of it is equal. Extraction operations proceed with the discipline of the Cyclops, voraciously consuming everything it can see and radically indifferent to meaning, facts and truth.
In a leaked memo, a Facebook executive, Andrew Bosworth, describes this willful disregard for truth and meaning: “We connect people. That can be good if they make it positive. Maybe someone finds love. … That can be bad if they make it negative. … Maybe someone dies in a terrorist attack. … The ugly truth is … anything that allows us to connect more people more often is *de facto* good.”
In other words, asking a surveillance extractor to reject content is like asking a coal-mining operation to discard containers of coal because it’s too dirty. This is why content moderation is a last resort, a public-relations operation in the spirit of ExxonMobil’s social responsibility messaging. In Facebook’s case, data triage is undertaken either to minimize the risk of user withdrawal or to avoid political sanctions. Both aim to increase rather than diminish data flows. The extraction imperative combines with radical indifference to produce systems that ceaselessly escalate the scale of engagement but don’t care what engages you.
Principles for the Third Decade
Let’s begin with a thought experiment: Imagine a 20th century with no federal laws to regulate child labor or assert standards for workers’ wages, hours and safety; no workers’ rights to join a union, strike or bargain collectively; no consumer rights; and no governmental institutions to oversee laws and policies intended to make the industrial century safe for democracy. Imagine instead that each company was left to decide for itself what rights it would recognize, what policies and practices it would employ and how its profits would be distributed. Fortunately, those rights, laws and institutions did exist, invented by people over decades across the world’s democracies. As important as those extraordinary inventions remain, they do not protect us from the epistemic coup and its anti-democratic effects.
The deficit reflects a larger pattern: The United States and the world’s other liberal democracies have thus far failed to construct a coherent political vision of a digital century that advances democratic values, principles and government. While the Chinese have designed and deployed digital technologies to advance their system of authoritarian rule, the West has remained compromised and ambivalent.
Unprecedented harms demand unprecedented solutions
Just as new conditions of life reveal the need for new rights, the harms of the epistemic coup require purpose-built solutions. This is how law evolves, growing and adapting from one era to the next.
When it comes to the new conditions imposed by surveillance capitalism, most discussions about law and regulation focus downstream on arguments about data, including its privacy, accessibility, transparency and portability, or on schemes to buy our acquiescence with (minimal) payments for data. Downstream is where we argue about content moderation and filter bubbles, where lawmakers and citizens stamp their feet at recalcitrant executives.
Downstream is where the companies want us to be, so consumed in the details of the property contract that we forget the real issue, which is that their property claim itself is illegitimate.
What unprecedented solutions can address the unprecedented harms of the epistemic coup? First, we go upstream to supply, and we end the data collection operations of commercial surveillance. Upstream, the license to steal works its relentless miracles, employing surveillance strategies to spin the straw of human experience — my fear, their breakfast conversation, your walk in the park — into the gold of proprietary data supplies. We need legal frameworks that interrupt and outlaw the massive-scale extraction of human experience. Laws that stop data collection would end surveillance capitalism’s illegitimate supply chains. The algorithms that recommend, microtarget and manipulate, and the millions of behavioral predictions pushed out by the second cannot exist without the trillions of data points fed to them each day.
Next, we need laws that tie data collection to fundamental rights and data use to public service, addressing the genuine needs of people and communities. Data would no longer serve as the means of information warfare waged on the innocent.
Third, we disrupt the financial incentives that reward surveillance economics. We can prohibit commercial practices that exert demand for rapacious data collection. Democratic societies have outlawed markets that trade in human organs and babies. Markets that trade in human beings were outlawed, even when they supported whole economies.
These principles are already shaping democratic action. The Federal Trade Commission initiated a study of social media and video-streaming companies less than a week after filing its case against Facebook and said it intended to “lift the hood” of internal operations “to carefully study their engines.” A statement by three commissioners took aim at tech companies “capable of surveilling and monetizing … our personal lives,” adding that “too much about the industry remains dangerously opaque.”
Groundbreaking legislative proposals in the European Union and Britain will, if passed, begin to institutionalize the three principles. The E.U. framework would assert democratic governance over the largest platforms’ black boxes of internal operations, including comprehensive audit and enforcement authority. Fundamental rights and the rule of law would no longer vaporize at the cyberborder, as lawmakers insist on “a safe, predictable, and trusted online environment.” In Britain the Online Harms Bill would establish a legal “duty of care” that would hold the tech companies responsible for public harms and include broad new authorities and enforcement powers.
Two sentences often attributed to Justice Brandeis feature in the congressional subcommittee’s impressive antitrust report. “We must make our choice. We may have democracy, or we may have wealth concentrated in the hands of a few, but we cannot have both.” The statement so relevant to Brandeis’s time remains a pungent commentary on the old capitalism we know, but it ignores the new capitalism that knows us. Unless democracy revokes the license to steal and challenges the fundamental economics and operations of commercial surveillance, the epistemic coup will weaken and eventually transform democracy itself. We must make our choice. We may have democracy, or we may have surveillance society, but we cannot have both. We have a democratic information civilization to build, and there is no time to waste.