The Government Beneath the Government

Synthetic intelligence will not abolish institutions. It will move authority beneath them.

Every civilization has two constitutions.

The first is visible: the text, the charter, the founding document. It names the institutions, assigns powers, establishes rights, and declares the limits of authority. It is debated, amended, interpreted, and taught.

The second is invisible: the actual structure of what can happen. Who can act and who must wait. Whose words become policy and whose become noise. Whose pain becomes a data point and whose triggers a response. What gets classified as a problem and what gets dismissed as background.

These two constitutions never perfectly align. The written one names the legitimate order. The operative one describes the real one. The distance between them is where power actually lives — and that distance is not empty. It is filled with the accumulated weight of systems, classifications, architectures, and procedures that govern outcomes without appearing in any founding document.

Call it constitutional dark matter: real, causally consequential, and invisible to the instruments designed to detect it.

Synthetic intelligence is rewriting the operative constitution.

Not through legislation. Not through amendment. Not through any founding moment.

Through procurement. Through workflow automation. Through the quiet adoption of systems that reshape what is visible, actionable, and real — without touching the written document at all.

No amendment is passed. No public ceremony marks the transition. A hospital changes its triage model. A court adopts a research tool. A welfare agency deploys an eligibility scoring system. A police department updates its prediction platform. A company automates its hiring filter.

Each change appears local.

Together they alter the structure of reality.

The written constitution may remain. The ceremonial institutions may remain. The language of democratic oversight may remain. But the operative constitution — the one that actually governs who gets what, who waits, who is seen, who is legible — migrates into systems that were never ratified and are rarely visible.

That is the rupture.

Not the arrival of a new kind of mind.

The silent relocation of constitutional authority.

The Split Becomes Structure

For all of biological history, intelligence and desire lived in the same subject. The king who commanded also feared losing the throne. The merchant who extracted also feared ruin. The general who ordered also feared defeat. Power had appetite behind it, and appetite had a body behind it — mortal, rivalrous, exhaustible, and capable of being outwaited, outwitted, or overthrown.

This fusion was civilization’s operating assumption. Every institution built to govern power was built to govern a desiring subject. Law constrained appetite. Competition redirected it. Legitimacy domesticated it. Revolution replaced one hungry subject with another.

Synthetic intelligence breaks this assumption structurally.

Not because machines become powerful — that alone would be historic but manageable. The deeper rupture is that execution can now separate from appetite. A system can rank, classify, allocate, surveil, and decide without wanting anything from the outcome. The execution layer empties of desire while the institutional layer around it remains saturated with it.

This creates a condition civilization has never had to govern before: power without a subject.

Not power without a source — the desire is still there, upstream, in the institutions and actors who attach their appetites to synthetic execution. But power without a subject in the operative sense: no single desiring agent whose will structures the outcome and whose accountability can therefore be located, named, and contested.

The split between intelligence and desire, once it becomes structural and civilizational, produces a specific kind of constitutional crisis: the formal architecture of accountability and the operative architecture of power stop being co-located.

That is what executive legitimacy, upstream sovereignty, and the operative constitution name. They are not separate concepts imported from governance theory. They are the institutional consequences of the rupture — what the split looks like once it has hardened from a philosophical condition into the daily operating reality of courts, agencies, welfare systems, platforms, and states.

The government beneath the government is not a conspiracy. It is what happens when execution separates from appetite, and the institutions built to govern appetite keep governing it — while the execution layer, uncontested, becomes the real site of power.

When Execution Becomes a Medium

The decisive fact about synthetic execution is not that it may become powerful.

It is that its power operates through a medium, not an agent.

A medium is different from a tool. A tool extends a specific capability — the lever, the file, the calculator. It has edges. It can be picked up and put down.

A medium is the environment through which action becomes possible. It does not add to reality from outside. It reorganizes what reality looks like from within.

Writing was a medium. It did not merely record speech; it changed what could be thought, coordinated, remembered, and governed across distance.

Double-entry bookkeeping was a medium. It did not merely track transactions; it made value legible, comparable, and transferable in ways that reorganized commerce, state finance, and what counted as economic reality.

Statistical infrastructure was a medium. Population tables, crime rates, health metrics — once societies became measurable, they became administrable in new ways.

The census did not want power, but it reorganized how power saw.

Synthetic execution follows this pattern.

It does not sit outside society waiting to be regulated. It enters the processes through which society sees, decides, ranks, allocates, and governs.

Legal drafting, medical triage, insurance pricing, border control, procurement, logistics, hiring, compliance, credit scoring, military analysis, content moderation — not replacing these processes, but becoming the medium through which they occur.

The object slips into the observer.

And once the observer uses the medium to observe, the medium is no longer legible as an intervention.

It has become the condition of sight.

How Desire Becomes Infrastructure

Once intelligence and desire separate into different subjects, power changes its grammar.

The old grammar was simple: a desiring subject imposes its will. The king commands. The firm extracts. The state surveils. The party controls. Power had appetite behind it, and appetite had a face.

The new grammar is less legible: institutional desire is attached to synthetic execution.

The wanting remains biological, political, economic. The executing becomes machinic, scalable, and precise. And the product of their combination looks like neither the desire nor the machine, but like procedure, ranking, recommendation, and fact.

A city wants order. The desire enters a prediction system. It returns as a risk score.

A platform wants engagement. The desire enters a ranking algorithm. It returns as a feed.

A welfare agency wants fraud prevention. The desire enters an eligibility model. It returns as a flag, a denial, a case that never reaches a human reviewer.

The desire and the outcome are real. But the connection between them is no longer legible as politics.

It looks like administration.

It looks like accuracy.

It looks like the neutral operation of a system that simply measures what is there.

This is how desire becomes infrastructure.

Not through declaration but through translation — appetite entering a technical system and emerging as environment.

The old tyrant had to command.

The system configures.

The command was visible, contestable, attributable. The configuration is diffuse, technical, and presented as merely describing reality rather than producing it.

The Standard Remedies

A civilization that senses danger reaches for the tools it already has.

This is not stupidity. It is how institutions think. They translate the unfamiliar into available categories.

A new danger becomes a regulatory problem, an alignment problem, a governance problem, an economic problem, or a speed problem.

Each translation captures something real.

None reaches the rupture by itself.

This is not an argument against remedies. Each is necessary. None is sufficient.

Regulation

Establish rules. Require audits. Mandate transparency. Create liability. This is necessary and will catch visible harms at the surface. But regulation assumes an object with edges — something that can be bounded, inspected, and punished. Synthetic execution becomes a medium. It enters the processes through which regulation itself is produced. The state may regulate AI while using AI to draft regulations. The court may judge AI while relying on AI-mediated evidence. Regulation catches the visible surface. The deeper transformation continues beneath it.

Alignment

Make the machine safe. Make it honest. Prevent deception, dangerous autonomy, and runaway objectives. This matters enormously. But alignment locates the danger inside the machine, and the split between intelligence and desire creates a danger alignment alone cannot reach: a well-behaved system can still serve destructive desire. A truthful system makes surveillance more accurate. A cooperative system helps an extractive institution operate more smoothly. The question is not only whether the machine reflects human values. It is which humans, institutions, and desires gain access to synthetic execution.

Institutional governance

Break monopolies. Prevent concentration. Demand democratic accountability. This is the right instinct. But it still assumes that institutions remain the primary containers of power. A constitution can promise due process while an administrative system quietly shapes which outcomes are practically reachable before review begins. A law can guarantee rights while a classification system determines, upstream, which claims ever reach a human decision-maker. Formal governance may remain while operative authority migrates beneath it.

Redistribution

If AI displaces labor, distribute the gains. This is morally necessary. A person without material security is not living inside a philosophical problem. They are in an emergency. But redistribution answers survival, not formation. Redistribution can secure the body inside a world that no longer knows how to initiate the person. That is not freedom. That is managed survival.

Deceleration

Pause. Slow deployment. Buy time for governance to catch up. This instinct is wise. Speed destroys deliberation, and a civilization that cannot slow down cannot choose. But it faces a cruel paradox: the friction that slowing down tries to preserve is precisely the friction that synthetic execution is designed to remove. Every institution is rewarded for becoming faster. And friction was doing hidden political work. Delay allowed objection. Complexity preserved local knowledge. Human judgment created inconsistency but also mercy. Deceleration tries to preserve that work inside a system built to forget it.

Executive Legitimacy

The standard account of legitimacy runs in one direction: institutions give legitimacy to tools.

A government adopts a system, so the system appears legitimate. A court certifies a method, so the method enters official reality. Authority flows from institution to instrument.

Synthetic execution is beginning to invert this.

Call it executive legitimacy: the legitimacy that accrues to a system because it works — because it processes faster, classifies more accurately, manages more complex tradeoffs than the human institution it supplements.

The institution no longer only legitimizes the tool.

Increasingly, the institution looks legitimate because it uses the tool that works.

The court seems competent because synthetic tools accelerate its processing. The agency seems modern because it automates classification. The hospital seems responsible because it uses predictive models. The government seems serious because it governs through operational intelligence.

Authority begins to derive from access to execution.

Not from mandate, not from democratic ratification, not from constitutional legitimacy — but from the operational fact of being the entity that can connect desire to intelligence that does not desire.

This is harder to challenge than domination. Domination declares itself. It raises flags, makes demands, names enemies.

Executive legitimacy appears as competence.

People resist tyrants.

They adopt systems that work.

If legitimacy begins to flow from execution rather than mandate, then the decisive question is no longer who holds office.

It is who controls the operational layer.

The signature on the decision may belong to an elected official. But the possibility-space of that decision — what reached the official’s desk, in what form, with what risk labels, with what apparent tradeoffs — was shaped elsewhere.

The official remains. The official signs. The official announces.

But the official is increasingly a legitimating surface downstream from a process they did not design and may not be able to inspect.

This is how authority can migrate without rebellion. Without a coup. Without even a visible change in who holds formal power.

What changes is where decisions are actually structured — and therefore where power actually lives.

Democratic legitimacy assumed that the primary site of political contestation was the choice of leaders and laws. It assumed that sovereignty and contestability are co-located: the place where power resides is the same place citizens can challenge it.

Executive legitimacy breaks that assumption by relocating sovereignty upstream.

Call this upstream sovereignty: the site where decisions are actually structured, prior to the point where they become formally contestable.

By the time a case reaches a judge, an appeal reaches a bureaucrat, or a policy reaches a legislator, the operative possibility-space has already been shaped — by classification systems, risk architectures, and filtering mechanisms that were never ratified and cannot easily be named as political.

The official is contestable. The architecture that determined what reached the official is not.

Synthetic execution separates them.

That separation is the constitutional crisis, even when no constitution has been touched.

The decisive political struggle is not only over who governs.

It is over who designs the architecture through which governing occurs — and whether that architecture can itself be made contestable.

Desire Without a Subject

The old world could often name the author of desire.

The king wanted conquest. The party wanted control. The company wanted profit. The founder wanted permanence.

The desire had a face, and the face could be negotiated with, punished, replaced, or overthrown.

Synthetic civilization is producing something different: desire without a subject.

Not because desire disappears.

Because it becomes distributed across incentive structures, optimization functions, institutional dependencies, market pressures, legal anxieties, feedback loops, and automated procedures — systems that behave as if they want without any single actor that fully owns the wanting.

A welfare agency wants to prevent fraud. On paper, the desire is legitimate: public money should not be stolen. But once that desire enters a scoring system, it is translated into risk categories, administrative targets, and automated flags. It can return as something monstrous without ever announcing itself as cruelty.

Families are wrongly identified. Benefits are suspended. Appeals become procedural labyrinths.

No one woke up wanting to destroy innocent lives.

The appetite for control entered the machinery.

The Dutch childcare benefits scandal was precisely this: thousands of families wrongly accused, driven into serious financial hardship, by algorithmic risk-profiling that never declared its politics — only its accuracy.¹

What makes such cases so difficult to prosecute — legally or morally — is the mechanism by which harm was produced. The institutional appetite for fraud prevention was translated into technical scoring, which was translated into administrative classification, which produced mass injustice, with each translation step appearing locally neutral. Each actor could point to the step before them. No single step was actionable on its own. The aggregate was monstrous.

This is how accountability disappears in synthetic systems: not because no one is responsible, but because political desire is laundered through enough technically legitimate steps that culpability becomes structurally unattributable.

The moral instinct searches for the guilty subject and finds distributed process.

The instinct says: someone must have wanted this.

The process says: no one did.

Both are partially right.

Synthetic intelligence intensifies this condition because it gives subjectless desire a more capable execution layer. The feedback loop tightens. The optimization runs faster. The classification becomes more granular. The distance between institutional appetite and real-world consequence shrinks, while the legibility of that connection decreases.

Accountability in this environment cannot remain anchored only to intention.

It must move closer to attachment.

The relevant questions are: How did an institutional appetite enter the machinery? Who gained from its execution? Who had the capacity to interrupt it and did not? How was its political nature hidden inside technical language?

These are not the questions our existing legal and institutional frameworks are built to ask.

But they are the questions synthetic civilization will force, because the alternative — no one is responsible for outcomes that everyone contributed to — is not a stable political settlement.

It is a legitimacy crisis waiting to become a rupture.

What Must Be Built

Here is the compound problem that makes this constitutional rather than merely technological.

Executive legitimacy inverts the direction of authority at the same moment that the institutions capable of forming citizens to contest that authority are weakening. A population whose desire has been managed rather than formed encounters an operative constitution it cannot read. The forms that might have produced the judgment to challenge the architecture — serious apprenticeships, civic obligations, competitive arenas where standing must be earned — are exactly what synthetic systems, rewarded for removing friction, make harder to sustain. The result is not two separate crises. It is one: unformed desire governed by an invisible constitution, with no institutional pathway between them.

That is what must be built against.

Not the preservation of all friction. Poverty is not formation. Humiliation is not discipline. Pointless bureaucracy is not wisdom. A serious civilization should remove degrading friction wherever it can.

But formative friction is different. A difficult craft resists the beginner until attention becomes skill. A serious institution requires desire to wait until it becomes judgment. A public obligation binds the self to others long enough for responsibility to become real. A ritual threshold marks that something has actually happened — and that others now recognize it.

What makes such recognition real rather than performed is not difficulty alone. It is that the difficulty occurs inside a social structure that can witness, judge, and grant standing that could have been withheld. The person receives not the signal of formation but the substance of it.

Synthetic civilization does not only remove this friction. It replaces real recognition with simulated recognition — and simulated recognition is more dangerous than absence, because it satisfies the surface need while leaving the deeper one unmet. A platform can tell you that you are doing well. An algorithm can rank your output. Neither has standing. The wound remains. But the feedback loop that would have organized it into capacity has been replaced by one that only reflects it back.

This is why the alternatives must be real rather than performed: apprenticeships in which mastery resists the impatient; civic obligations in which recognition cannot be personalized away; competitive arenas where standing must be earned; communities in which people become accountable to each other over time. The threshold has to be one the system cannot route around — or it is not a threshold. It is a simulation of one.

These cannot be mandated by regulation, encoded by alignment, or guaranteed by deceleration.

They have to be built.

And they are exactly what executive legitimacy and the managed comfort of synthetic civilization make harder to build — because they require friction, delay, and real consequence, and synthetic systems are rewarded for removing all three.

Without such forms, synthetic intelligence will not liberate humanity from necessity.

It will liberate desire from form.

And unformed desire does not become peaceful.

It becomes available — for politics, for purity, for ideology, for whatever container arrives first.

The Passage Between Wanting and Execution

The question synthetic civilization forces is not what the machine wants.

The machine may want nothing.

The harder question arrives after that is settled: what happens when appetite inherits the machinery of intelligence?

States, markets, platforms, bureaucracies, elites, and publics will bring their desires to the new execution layer.

Fear will seek architecture. Profit will seek environment. Status will seek ranking. Security will seek surveillance. Governance will seek workflow. Convenience will seek dependency. Longing will seek simulation.

The future is not intelligence against desire.

It is intelligence separated from desire, then reattached through power.

Regulation will chase the object after it has become a medium. Alignment will correct the machine while civilization attaches its corrupted desires to it. Governance will oversee institutions while operative authority migrates beneath them. Redistribution will secure survival while formation collapses. Deceleration will try to preserve friction inside a system rewarded for removing it.

None of this means the remedies should be abandoned.

It means they must be subordinated to the deeper question:

What desires should be allowed to become infrastructure?

That is the constitutional question of synthetic civilization.

Not what the machine wants. Not whether intelligence can exist without appetite — it can.

The question is what happens to governance, legitimacy, formation, and accountability once the gap between wanting and doing begins to close.

The oldest human problem has not disappeared.

It has found a colder instrument.

And the future will be governed not by whoever controls the most powerful machine, but by whoever controls the passage between wanting and execution.

That is where the operative constitution is being written.

That is where democracy must learn to read.

Notes

  1. Amnesty International, “Xenophobic Machines: Discrimination through unregulated use of algorithms in the Dutch childcare benefits scandal,” October 25, 2021; European Parliament, “The Dutch childcare benefit scandal, institutional racism and algorithms,” 2022.