Hemlock Collective
Working Papers
WP-001

The Architecture of Ignorance: Systemic Risk and the Destruction of Public Signal

Jason I. Oh

Founder, Hemlock Collective · jason@hemlockcollective.com

December 2025 · Version 1.0 · Deconstructing Project 2025's Capture Architecture

Abstract

This essay applies systems engineering frameworks to analyze the deliberate dismantling of federal statistical infrastructure under Project 2025. Drawing on two decades of experience building observability systems for high-stakes financial and technological platforms, I argue that the coordinated attack on agencies such as the Bureau of Labor Statistics, the Census Bureau, and the National Oceanic and Atmospheric Administration represents not "deregulation," but the systematic destruction of the state's capacity to measure reality.

When measurement stops, accountability becomes structurally impossible. Courts cannot review evidence that was never collected. Markets cannot price risks they cannot see. Future administrations inherit a governance system flying blind, unable to diagnose the damage or design a recovery because the diagnostic logs were never written.

I introduce the concept of epistemic debt, the compounding cost of institutional ignorance, and demonstrate how the administration has weaponized the Administrative Procedure Act's evidentiary requirements to "judge-proof" its deregulation agenda. This is not a political strategy; it is an architectural exploit. The essay concludes by reframing data infrastructure as being as critical to democratic survival as the power grid or the interstate highway system.

Keywords: Democratic backsliding · Observability · Administrative law · Project 2025 · Epistemic capture · Statistical agencies
JEL: D73, H11, K23, P16

In systems engineering, “observability” is a specific technical term. It gets confused with monitoring, but they are different disciplines. Monitoring is a dashboard of green lights; it tells you if the server is on. Observability is a measure of how well you can understand the internal state of a complex system purely by examining its outputs.
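The difference is easier to show than to describe. Below is a minimal sketch in Python; the service, the event fields, and the numbers are all invented for illustration, not drawn from any real system.

```python
import json
from dataclasses import dataclass

# Monitoring: a dashboard of green lights. It answers the one question
# someone thought to ask in advance: is the server on?
def health_check(ping_ok: bool) -> str:
    return "GREEN" if ping_ok else "RED"

# Observability: the system emits structured events rich enough that its
# internal state can be reconstructed from the outputs alone.
@dataclass
class Event:
    route: str
    queue_depth: int
    cache_hit: bool
    latency_ms: float

def emit(event: Event) -> None:
    # In production this would feed a log pipeline; here, stdout.
    print(json.dumps(event.__dict__))

# A health check on this service stays GREEN the whole time...
emit(Event(route="/quote", queue_depth=3, cache_hit=True, latency_ms=12.0))
emit(Event(route="/quote", queue_depth=180, cache_hit=False, latency_ms=480.0))
# ...but the event stream reveals a system that is quietly failing:
# queue depth and latency exploding while the server stays "on".
```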

If you lose observability, you don’t just lose data. You lose the ability to distinguish between a system that is functioning and one that is quietly failing. You are flying a 747 over the ocean at night. The altimeter goes dark. You might be at thirty thousand feet, or you might be inches from the water. You won’t know until the impact.

Most political analysis of the administration’s assault on the federal statistical apparatus throughout 2025 has framed the issue through ideology; pundits talk about “deregulation,” “trimming the fat,” or the “war on the administrative state.”

That framing is insufficient. It misses the architectural reality of the last eleven months.

We are watching the deliberate introduction of blindness into what I call the “sovereign stack,” borrowing the term from software architecture, where a “tech stack” refers to the layers of technology (databases, servers, applications) that must work together for a system to function. Here, the sovereign stack is the layered infrastructure of measurement and reporting that allows a government to see itself. The administration is dismantling the sensory array (the network of statistical agencies that function as the state’s eyes and ears). By severing the feedback loops required to navigate reality, they are rendering democratic accountability structurally impossible.

They are bricking the machine.

The Sensor Problem

James C. Scott’s Seeing Like a State is the standard text for understanding how governments measure their populations. Scott argued that modern governance depends on “legibility,” the capacity to render messy human lives into standardized data points that the state can process. Scott was largely concerned with the dangers of this: that a state that sees too much can coerce too effectively.

There is an inverse vulnerability Scott didn’t fully anticipate. Legibility isn’t just a tool for control. It is the precondition for debugging.

For the last century, the United States invested billions in building an observability pipeline. The Bureau of Labor Statistics (BLS) gave us the signal on wages and inflation. The Census mapped the demographic terrain. The National Center for Education Statistics (NCES) tracked whether we were actually teaching children or just warehousing them. The Energy Information Administration (EIA) tracked the grid.

These weren’t just “agencies.” They were sensors; the instrumentation panel that allowed the electorate, and the market, to see the consequences of policy.

Mandate for Leadership: The Conservative Promise, the Project 2025 blueprint published by the Heritage Foundation, identified these sensors not as public goods, but as vulnerabilities to be patched out. The document explicitly called for placing the Census Bureau and other statistical agencies under direct political supervision within the Department of Commerce to ensure alignment with “conservative principles.”1

The language used to justify this throughout the transition was standard corporate restructuring jargon: “alignment,” “efficiency,” “reducing redundancy.” Anyone who has sat through a merger knows what this actually means. The independent compliance officer is getting fired because he keeps flagging the CEO’s out-of-policy expenses.

We saw the operationalization of this logic in August. BLS Commissioner Erika McEntarfer was fired hours after an unfavorable jobs report.2 The signal wasn’t just degraded; the integrity of the data stream itself became suspect. This created a crisis of confidence that goes beyond politics. Bond traders, union negotiators, and city planners rely on the integrity of that data. When integrity is compromised, the “truth” of the American economy becomes, as Project 2025 intended, a matter of partisan allegiance rather than empirical fact.

The deeper strategy has been subtler than outright manipulation. Manipulation is risky. It leaves fingerprints. The smarter play, the one we saw at the EPA and the Department of Education this fall, is to turn the sensors off. The method of turning off the sensors itself becomes the spectacle that consumes media cycles, distracting from the underlying purpose.

Consider the gutting of the NCES. By reducing staff by 95% and canceling longitudinal studies, the administration isn’t producing biased data on student achievement. They are producing no data. Null sets. If you stop counting maternal mortality rates (as HHS has effectively done by dismantling the review committees), the rate hasn’t gone up or down; it has simply vanished from the dashboard.3 The red light didn’t turn green; someone smashed the bulb, and accountability vanished with it.

Table 1: The Sovereign Stack: Isomorphic Vulnerability Across Technical and Governmental Systems
| System Layer | In Technology | In Government | What P2025 Attacks |
| --- | --- | --- | --- |
| Interface | What users see | Public-facing policy | (leaves visible) |
| Processing | Business rules, algorithms | Regulatory agencies (EPA, OSHA, SEC) | (hollows out) |
| Measurement | Logs, metrics, monitoring | Statistical agencies (BLS, Census, NOAA) | ✓✓✓ (primary target) |
| Foundation | Servers, networks, databases | Constitutional structure (courts, civil service) | (secondary target) |
Note: Project 2025's primary attack vector targets the Measurement layer. By destroying measurement infrastructure, the regime blinds the entire stack without needing to directly control the Interface and Processing layers (public-facing policy and the regulatory agencies) or overcome the Foundation layer (constitutional checks).

The Irreversibility Principle

The “deregulation” frame fails here because it incorrectly assumes reversibility.

In a democracy, we treat policy like a pendulum. If an administration passes a bad tax law, the next administration can repeal it. If they roll back a regulation, it can be reinstated.

Data does not swing back. It is temporal, and it obeys the laws of entropy: a measurement not taken can never be retroactively reconstructed.

Take the National Assessment of Educational Progress (NAEP), often called “The Nation’s Report Card.” Its value lies in the trend line spanning half a century. It allows us to see the long-tail effects of No Child Left Behind, or the disruption of COVID-19, or the impact of digital literacy. It is a time-series database of the American mind.

With the NCES dismantled, the 2026 assessment is dead. The 2028 assessment is likely dead. Even if a reform administration takes over in 2029, they cannot go back in time and measure what 4th graders knew in 2026. That gap in the record is permanent. We will never know, with empirical certainty, what happened to a generation of students during this period.
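A toy sketch makes the irreversibility concrete. The scores below are invented placeholders, not actual NAEP results; the structure is the point.

```python
# Invented placeholder scores, not real NAEP values; the structure is the point.
naep_math_grade4 = {
    2019: 241,
    2022: 236,
    2024: 237,
    2026: None,   # assessment canceled: not delayed, not noisy, but absent
    2028: None,
}

def naive_interpolation(series: dict, year: int, lo: int, hi: int) -> float:
    """Estimate a missing year from its neighbors. This is fabrication,
    not measurement: it assumes the trend between lo and hi was smooth,
    which is precisely the question the missing assessment existed to
    answer. A collapse or a recovery in 2026 is invisible by construction.
    """
    frac = (year - lo) / (hi - lo)
    return series[lo] + frac * (series[hi] - series[lo])

# Even a reform administration in 2029 can only estimate, and only after
# waiting for the next real anchor point (2030 at the earliest):
#   estimate_2026 = naive_interpolation(naep_math_grade4, 2026, 2024, 2030)
```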

The same logic applies to the EPA’s suspension of fence-line monitoring requirements for coke plants and chemical manufacturers. We aren’t just letting them pollute today. We are ensuring that ten years from now, when leukemia clusters emerge in those zip codes, there will be no baseline data to prove causation. The families will know their children are sick. They will suspect the refinery. But they will never be able to prove it in court—because the evidence was never collected.

What this means concretely: If you’re a parent in Louisiana, you will never know if your child’s school is underperforming relative to national standards because the comparison data is gone. If you’re a worker in a refinery town, you will never be able to prove in court that your child’s leukemia was caused by pollution because the baseline measurements were never taken. If you’re a retiree with a pension fund, you’re now betting your retirement on employment figures that may no longer reflect economic reality.

This distinguishes the current moment from standard propaganda. Propaganda fills the information environment with noise. It fights truth with a torrent of lies. In theory, you can fight back with better reporting, leaks, and whistleblowers.

A void cannot be fought. There is no Wayback Machine for missing data, no leaks from studies that were never commissioned.

By creating these epistemic holes, the administration is achieving something far more durable than a temporary policy win. They are salting the earth for future governance. They are ensuring that even if they lose power, the incoming regime will be flying blind—unable to diagnose the damage or design a recovery because the diagnostic logs were never written.

To understand the full architecture, we have to look at the judiciary. I spent the early part of my career working in M&A due diligence, and one lesson from that world is that the architecture of the record often dictates the outcome of the dispute. You win by controlling what exists to be reviewed.

Administrative law in the United States, the body of law governing what agencies like the EPA or SEC can do, relies heavily on the “Administrative Record.” Under the Administrative Procedure Act (APA), when an agency makes a rule (or repeals one), it must show that its decision wasn’t “arbitrary and capricious.” It must point to the data, the public comments, the scientific studies that justify the action.

Historically, this has been a check on executive power. If the EPA wants to ban a pesticide, it has to show the science. If it wants to un-ban it, it has to show why the science changed.

The loophole opens when the agency stops collecting data.

When the EPA stopped collecting emissions data from polluters in September, they blinded the courts.4 If there is no data in the Administrative Record showing that pollution has increased, a community group suing to force the agency to act has no standing. The agency can simply shrug: We have no evidence of a problem.

By destroying the measurement infrastructure, the administration is effectively “judge-proofing” its deregulation. A court cannot find that an agency ignored the evidence if the agency successfully ensured the evidence never existed. It is a denial-of-service attack on the judicial review process itself.

This is a deeply cynical and sophisticated legal hack. It weaponizes the evidentiary standards of the APA against the public interest. It turns the “presumption of regularity,” the legal idea that officials are doing their jobs, into a shield for malfeasance. The courts, designed to review the record, are rendered impotent when the record is blank.

The Economics of Hidden Extraction

There is a cold economic rationality to this. If you’ve worked in high-frequency trading or risk management, you know that markets run on information symmetry, the idea that all participants have roughly equal access to the facts. The “efficient market hypothesis” assumes that all available information is priced in.

The war on measurement destroys that symmetry to enable rent-seeking (economic extraction without productive contribution). It is a mechanism for transferring value from the public to specific private actors by hiding the cost of the transfer.

Consider the environmental example again. Pollution is an economic externality, a cost of production that isn’t paid by the producer; it is paid by the people breathing the air or drinking the water. Regulation is the attempt to internalize that cost via fines or caps. Measurement is the accounting system that makes the cost visible.

By killing the data stream, the administration allows companies to offload these costs back onto the public balance sheet in the form of respiratory failure, lower property values, and remediation costs, without those costs ever appearing on their corporate ledger. Damages without documentation remain theoretical, unlitigable.

The risk extends to the financial markets themselves. This is the part that should terrify Wall Street, though they seem slow to price it in.

The global financial system uses US Treasury bonds as the “risk-free rate,” the baseline against which all other assets are priced. That “risk-free” status depends on the assumption that the US government is a transparent, predictable counterparty.

We saw the first fracture in this assumption on May 16, when Moody’s downgraded US sovereign debt, citing concerns over “institutional durability” and the fiscal recklessness of the proposed tax cuts.5 The market reacted immediately: the 30-year Treasury yield crossed 5%, and the cost of servicing the national debt spiked.

The administration looked at the dashboard, saw the red lights flashing, and realized they could not alter the fiscal reality. So three months later, they smashed the dashboard. The firing of the BLS commissioner in August was not a random act of spite; it was a direct response to the May downgrade. If you cannot satisfy the bond vigilantes with math, you blind them with silence.

When official indicators become suspect, or when the methodology is abruptly changed to hide a recession, capital allocation becomes a guessing game. Pension funds and sovereign wealth managers are now pricing assets against employment figures that may no longer reflect economic reality.

In 2008, we learned what happens when the financial system prices assets against fraudulent risk models (the subprime mortgage crisis). We are now running that experiment on the sovereign economy itself. The bubble isn’t in housing this time. It’s in the institutional credibility of the United States.

If the market decides it can no longer trust BLS data, the “risk premium” on US assets goes up. Borrowing costs rise. The dollar weakens. The “Liar’s Dividend,” the advantage that accrues when any inconvenient number can be dismissed as fake, isn’t just a political concept; it has a basis point value. We are leveraging the credibility of the state to buy short-term political survival. When the margin call comes, it will be devastating.
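The conversion from credibility to dollars is mechanical enough to sketch. Both inputs below are rough illustrative assumptions of mine, not official figures.

```python
# Back-of-envelope: what a basis point of credibility costs.
# Both inputs are rough illustrative assumptions, not official figures.
marketable_debt = 28e12      # assumed stock of marketable Treasuries (~$28T)
risk_premium_bp = 25         # hypothetical premium if official data loses trust

# One basis point is 0.01%. The premium applies as the debt rolls over at
# the new, higher rates; applied to the full stock, the annual cost is:
added_annual_interest = marketable_debt * (risk_premium_bp / 10_000)
print(f"${added_annual_interest / 1e9:,.0f}B per year")   # prints: $70B per year
```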

Breaking the Feedback Loop

Cybernetics (the study of control systems) teaches that self-correction requires feedback. A thermostat cannot regulate temperature if it cannot measure heat. A missile cannot hit a target if it cannot track its own trajectory.
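The thermostat is worth taking literally for a moment. In the sketch below (all values invented), the control loop keeps running after its sensor is swapped for a fixed “official” reading; it simply stops regulating.

```python
# A minimal control loop; all numbers are invented for illustration.
def controller(reading: float, setpoint: float) -> bool:
    """Measure, compare, correct: heat exactly when the reading is low."""
    return reading < setpoint

def step(actual: float, heater_on: bool) -> float:
    """Physics does not care what the controller believes."""
    return actual + (1.5 if heater_on else -0.5)

# Healthy loop: the controller sees the actual temperature.
actual = 65.0
for _ in range(20):
    actual = step(actual, controller(actual, setpoint=70.0))
print(round(actual, 1))   # 71.0: regulating, oscillating around the setpoint

# Blinded loop: the sensor is replaced with a fixed "official" reading at
# the setpoint. The loop still runs; it just stops regulating. The heater
# never fires, and the room drifts while the controller reports all is well.
actual, official_reading = 65.0, 70.0
for _ in range(20):
    actual = step(actual, controller(official_reading, setpoint=70.0))
print(round(actual, 1))   # 55.0: ten degrees of unseen drift
```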

The Founders, though they didn’t use the term “cybernetics,” understood this implicitly. They mandated the Census in Article I of the Constitution because they understood that representation requires measurement. You cannot govern a people you cannot count.

The architecture of ignorance cuts the wire between the governance mechanism and the reality it governs.

If unemployment spikes but the BLS report says it’s flat, the feedback loop is broken. The administration feels no pressure to adjust policy.

If schools fail but the testing data is gone, the feedback loop is broken. Parents cannot organize effectively because they lack the comparative data to prove their district is underperforming.

If the climate warms but the NOAA satellites are defunded, the feedback loop is broken. We lose the lead time necessary for adaptation.

The administration is building a system that is immune to error correction because it refuses to acknowledge the existence of error.

This creates a state of what I call affective sovereignty, governance judged not by measurable outcomes, but by how the leader makes people feel. When you destroy the metrics, governance is no longer assessed by results (which are falsifiable). It is assessed by vibes and whatever the misinformation apparatus produces. The President can claim the economy is the “greatest in history.” Without a credible, independent BLS, that claim floats free of reality.

This is the ultimate goal of the project: to move politics entirely into the realm of the aesthetic. If the numbers are gone, the only thing that matters is the performance. It is the transition from a technocratic democracy, however flawed, to governance as theater.

The Technical Debt of Democracy

In software development, we talk about “technical debt,” the implied cost of additional rework caused by choosing an easy solution now instead of using a better approach that would take longer.

We are accumulating what I call epistemic debt, the compound cost of institutional ignorance. Every month that the injury surveillance programs at the CDC are offline is a month of debt. Every month the Census is manipulated is a month of debt.

The frightening thing about technical debt is that it compounds. The longer you go without fixing the underlying architecture, the harder it becomes to ever fix it. Eventually, the system becomes so brittle that you can’t touch it without breaking it.
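One way to make the compounding visible is a toy recurrence. The functional form and the parameters below are my own illustration, not an established metric.

```python
# Toy model of compounding epistemic debt:
#   D(t+1) = (1 + r) * D(t) + g
# g: the new gap added each month a sensor stays dark.
# r: the compounding rate, because each missing month makes adjacent data
#    harder to calibrate, staff harder to rehire, methods harder to restart.
def epistemic_debt(months_dark: int, g: float = 1.0, r: float = 0.03) -> float:
    debt = 0.0
    for _ in range(months_dark):
        debt = (1 + r) * debt + g
    return debt

print(round(epistemic_debt(12), 1))   # 14.2: one dark year costs more than 12 month-units
print(round(epistemic_debt(48), 1))   # 104.4: a full term costs ~2.2x the linear estimate
```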

We are approaching that point with the federal government. If this architecture of ignorance remains in place for a full term, we may cross the event horizon. The continuity of data will be so thoroughly broken that we will essentially have to reboot the state’s knowledge base from zero.

A future administration committed to evidence-based governance will inherit a shattered dashboard. They will face a measurement blackout spanning years. They won’t know the baseline for environmental toxins. They won’t know the true unemployment rate. They won’t know the depth of the learning loss in schools.

A system cannot be patched, much less optimized, without knowing its current state.

Defending the Stack

In systems architecture, there is a concept called “Mean Time to Recovery” (MTTR)—the average time it takes to restore a failed system to operational status. The critical question is not if you can recover, but how long recovery takes relative to the damage accumulation rate.

If the system degrades faster than you can repair it, you cross a threshold where recovery becomes impossible. The architecture collapses into an unrecoverable state.
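The threshold can be stated as a toy model. The rates below are invented for illustration, and even “recovery” here means only restoring future measurement capacity; the gaps already in the record stay.

```python
# The recovery race as a toy model; rates are invented for illustration.
# While the sensors are dark, damage accrues at rate d per month. Once a
# repair effort starts, it restores capacity at rate r per month, while
# some decay (attrition, broken methods) continues underneath it.
def months_to_recover(months_dark: int, d: float, r: float) -> float:
    damage = months_dark * d
    if r <= d:
        return float("inf")       # past the threshold: unrecoverable
    return damage / (r - d)       # repair must outrun the ongoing decay

print(months_to_recover(12, d=1.0, r=2.0))   # 12.0: a year to claw back a dark year
print(months_to_recover(12, d=1.0, r=0.8))   # inf: degradation outpaces repair
```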

We are approaching that threshold.

The destruction of federal statistical infrastructure is not a policy that can be “repealed” in the way a tax cut can be repealed. Each month of missing data is a permanent gap in the historical record. Each disbanded research team represents lost institutional knowledge that takes decades to rebuild. Each canceled longitudinal study is a severed thread in the fabric of our collective memory.

The standard defense of these agencies, framing them as “neutral arbiters” or “scientific institutions,” misses the point. The politics of measurement are irrelevant here. What’s at stake is the capacity for governance itself.

What Defense Looks Like

Defending observability is not a slogan. It is a technical specification.

First, archival redundancy. Every federal dataset must be mirrored in multiple jurisdictions—state governments, universities, independent research institutions. The data must exist in forms that survive administrative hostility. This is disaster recovery protocol, not resistance.

Second, observability as a constitutional requirement. The Census is mandated in Article I precisely because the Founders understood that representation requires measurement. That principle must extend to all domains where 21st-century governance requires feedback: economic data, environmental monitoring, public health surveillance. A government that deliberately blinds itself is violating the structural preconditions of democratic accountability.

Third, litigation as a forcing function. The Administrative Procedure Act requires agencies to justify their decisions with evidence. When agencies stop collecting evidence, they are not just shirking their duties; they are rendering judicial review impossible. Courts must treat the absence of data collection as prima facie evidence of arbitrary and capricious action. Destroying the record is destroying the rule of law.

The Recovery Calculus

If this administration serves a full term with the current trajectory, we will face a four-year data blackout across multiple critical domains. The gap will be unbridgeable.

A future administration will inherit a government that has been effectively lobotomized. They will not know:

  • The true depth of environmental degradation
  • The actual state of public health
  • The real performance of the education system
  • The baseline conditions from which recovery must begin

They will be governing a nation-sized system with no telemetry, no diagnostics, and no way to distinguish signal from noise.

In software, we call this “flying blind.” In aviation, it’s called “controlled flight into terrain,” the deadliest category of accident, where a functioning aircraft flies into the ground because the crew has lost situational awareness.

The American state is currently in controlled flight into terrain. The altimeter is dark. The pilots are confident. And the mountain is closer than anyone realizes.

There is no recovery from impact.

The only mitigation is to restore observability now, before the continuity of measurement is so thoroughly broken that the next generation of leaders inherits not a damaged system, but an archaeological site.

The defense of federal statistical infrastructure is not about saving jobs or preserving bureaucracy. It is about maintaining the minimum viable feedback loops required to operate a high-complexity civilization.

Without those loops, we are not governing. We are guessing. And at the scale of the American state, guessing is indistinguishable from collapse.


  1. Mandate for Leadership: The Conservative Promise, The Heritage Foundation, 2023, p. 664. ↩︎

  2. Christopher Rugaber, “Trump Demands Official Overseeing Jobs Data Be Fired after Dismal Employment Report,” AP News, August 1, 2025. ↩︎

  3. Alec MacGillis, “Trump’s War on Measurement Means Losing Data on Drug Use, Maternal Mortality, Climate Change and More,” ProPublica, April 18, 2025. ↩︎

  4. Maxine Joselow, “E.P.A. to Stop Collecting Emissions Data From Polluters,” The New York Times, September 12, 2025. ↩︎

  5. Tony Duehren, Andrew Romm, and Joe Rennison, “U.S. Downgraded by Moody’s as Trump Pushes Costly Tax Cuts,” The New York Times, May 16, 2025. ↩︎