Attested State Infrastructure

The Economics of Machine-Verifiable State

As software systems increasingly consume and act on shared state — trading, routing, hedging, insuring, lending, settling — they create demand for a new category of infrastructure: standardized, cryptographically signed, machine-verifiable representations of real-world state. AI-assisted development has dramatically reduced the cost of building software, including data integrations. This paper argues that lower build costs do not reduce the need for shared infrastructure — they increase it, by lowering the barrier to duplicated effort while leaving maintenance, operational, and verification costs unchanged. The economics of producing and distributing attested state follow a distinct pattern: high ongoing costs of aggregation and maintenance, near-zero marginal costs of serving, and a verification property that eliminates the need for reputational trust between systems. We call this category attested state infrastructure and describe why it emerges naturally as machine-to-machine coordination scales.


1. The Problem: Machines Cannot Trust

Human economic actors tolerate remarkable ambiguity in their information sources. A trader reads Bloomberg, cross-references Reuters, adjusts for known biases, and makes a judgment call. The information is consumed through reputation, context, and experience. Trust is probabilistic. It works because humans are good at it.

Software systems are not.

A trading algorithm executing a hedge, a routing engine adjusting a shipping path, or a settlement system resolving a conditional contract cannot rely on reputational trust. It cannot call a sales representative to resolve a discrepancy. It cannot intuit that one data source is more reliable than another based on years of industry experience. It requires:

  - A canonical, deterministic representation of the state it is acting on
  - A cryptographic signature that proves the data's integrity and origin
  - Explicit provenance: which sources were aggregated, and by what methodology
  - Temporal precision: exactly when the state was observed

These are not features. They are preconditions for software systems to consume shared state safely and act on it economically.


2. The Duplication Problem

Consider a concrete example. A developer needs the current implied volatility of Bitcoin options to power a hedging system. To produce this number reliably, the developer — likely working with an AI coding assistant — must:

  1. Integrate with multiple derivatives exchanges (each with its own API format, rate limits, authentication, and failure modes)
  2. Normalize the data across exchanges (different tick sizes, settlement mechanics, and reference prices)
  3. Handle degraded states (exchange outages, stale data, API changes)
  4. Aggregate using a defensible methodology (median, weighted mean, outlier exclusion; see the sketch after this list)
  5. Produce the result in a canonical format that other systems can consume
  6. Run this infrastructure continuously with monitoring, failover, and historical archiving
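
To make steps 4 and 5 concrete, the sketch below shows one possible aggregation pass in Python: take per-exchange implied-volatility readings, drop outliers beyond a fixed band around the median, and emit a canonical record. The field names, the 20% band, and the metric identifier are assumptions for illustration, not a prescribed methodology.

    # Illustrative aggregation of per-exchange implied-volatility readings.
    # Field names, the 20% outlier band, and the payload layout are assumptions
    # made for this sketch, not a prescribed methodology.
    import json
    import statistics
    from datetime import datetime, timezone

    def aggregate_iv(readings: dict[str, float]) -> dict:
        """readings maps exchange name -> implied volatility (e.g. 0.52 = 52%)."""
        mid = statistics.median(readings.values())
        # Outlier exclusion: discard readings more than 20% away from the median.
        kept = {ex: v for ex, v in readings.items() if abs(v - mid) <= 0.20 * mid}
        aggregate = statistics.median(kept.values())
        # Canonical form: sorted keys, fixed precision, explicit UTC timestamp.
        return {
            "metric": "BTC_IMPLIED_VOL_30D",
            "value": round(aggregate, 6),
            "sources": sorted(kept),
            "excluded": sorted(set(readings) - set(kept)),
            "observed_at": datetime.now(timezone.utc).isoformat(timespec="seconds"),
        }

    if __name__ == "__main__":
        sample = {"exchange_a": 0.51, "exchange_b": 0.53, "exchange_c": 0.95}
        print(json.dumps(aggregate_iv(sample), sort_keys=True))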

In 2026, AI-assisted development has compressed the initial build dramatically. What once took an engineer 6-8 weeks can be scaffolded in days. An AI coding agent supervised by a competent developer can generate exchange API clients, build normalization layers, and wire up aggregation logic at a fraction of the historical cost.

This changes the economics of building. It does not change the economics of maintaining.

Exchanges modify their APIs without warning. Rate limits shift. Authentication mechanisms change. Data formats evolve. WebSocket connections drop and must be rebuilt with new handshake logic. Settlement mechanics are updated. New exchanges launch and existing ones delist products. Each of these changes requires detection, diagnosis, and correction — ongoing operational work that compounds over time regardless of how quickly the initial integration was written.
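
Part of that detection work can be mechanized, but it still has to run somewhere, continuously, for every independent implementation. The sketch below illustrates the kind of check involved, using hypothetical field names and thresholds: confirm a response still has the expected shape, is fresh, and carries a plausible value.

    # A minimal drift check for one exchange response. The expected fields and
    # the five-minute freshness window are assumptions for illustration.
    import time

    EXPECTED_FIELDS = {"instrument", "mark_iv", "timestamp"}
    MAX_AGE_SECONDS = 300

    def check_response(payload: dict) -> list[str]:
        """Return a list of problems; an empty list means the response looks sane."""
        problems = []
        missing = EXPECTED_FIELDS - payload.keys()
        if missing:
            problems.append(f"missing fields: {sorted(missing)}")  # schema drift
        ts = payload.get("timestamp")
        if isinstance(ts, (int, float)) and time.time() - ts > MAX_AGE_SECONDS:
            problems.append("stale data: timestamp older than freshness window")
        iv = payload.get("mark_iv")
        if isinstance(iv, (int, float)) and not (0.0 < iv < 5.0):
            problems.append(f"implausible implied volatility: {iv}")  # sanity band
        return problems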

An AI agent can scaffold an integration in hours. Maintaining that integration against the reality of 11 live exchange APIs, each changing independently and unpredictably, is a continuous operational burden that does not compress the same way.

Now consider that 10,000 developers need the same number. Under the current model, each must independently build AND maintain this entire infrastructure. Because AI-assisted development makes building cheap, more developers attempt it than would have in a purely human-engineering world. The barrier to starting is low. The barrier to maintaining reliably is unchanged.

The result is not less duplication. It is more. Lower build costs mean more entities independently constructing and maintaining identical data pipelines. Ten thousand teams — or a hundred thousand — each monitoring the same exchange API changes, patching the same edge cases, and running the same infrastructure. The aggregate maintenance cost scales linearly with the number of independent implementations, regardless of how cheaply each was initially built.

This is pure economic waste. The aggregation and maintenance cost is ongoing. The marginal cost of serving an additional consumer is near zero. The rational outcome is shared infrastructure — not because building is hard, but because maintaining and operating independently is wasteful at scale.


3. The Attestation Property

Shared data infrastructure is not new. APIs have existed for decades. What makes attested state infrastructure distinct is the verification property: cryptographic signatures that allow any consumer to independently verify the integrity, provenance, and temporal accuracy of the data without trusting the provider.

A traditional API asks the consumer to trust the provider. If the provider is compromised, returns stale data, or silently changes its methodology, the consumer has no way to detect this programmatically. Human consumers compensate for this with reputation, contracts, and oversight. Machine consumers cannot.

An attested response includes:

  - The state itself, in a canonical, deterministic format
  - A timestamp recording exactly when the state was observed
  - Provenance: the sources and methodology that produced it
  - A cryptographic signature over all of the above, verifiable against the provider's public key

This changes the trust model fundamentally. The consumer does not need to trust the provider's integrity claims — it verifies them cryptographically. Trust shifts from reputation to mathematics.
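
A minimal sketch of the sign-and-verify round trip, assuming an Ed25519 key pair and the Python 'cryptography' package, looks like this; the payload layout is invented for illustration and does not describe any particular provider's format.

    # Sign-and-verify round trip for an attested reading. The payload layout is
    # invented for this sketch; only the signing/verification pattern matters.
    # Requires: pip install cryptography
    import json
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # Provider side: serialize deterministically, then sign the exact bytes.
    provider_key = Ed25519PrivateKey.generate()
    payload = {
        "metric": "BTC_IMPLIED_VOL_30D",
        "value": 0.52,
        "observed_at": "2026-01-15T12:00:00Z",
        "methodology": "median-of-3-exchanges-v1",
    }
    canonical = json.dumps(payload, sort_keys=True, separators=(",", ":")).encode()
    signature = provider_key.sign(canonical)

    # Consumer side: needs only the payload, the signature, and the public key.
    public_key = provider_key.public_key()
    try:
        public_key.verify(signature, canonical)
        print("attestation verified")
    except InvalidSignature:
        print("attestation rejected")

The deterministic serialization step is essential: producer and consumer must agree byte-for-byte on the canonical form, which is why the canonical format is part of the infrastructure rather than an afterthought.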

This is not a marginal improvement. For software systems making economic decisions, it is the difference between a usable data source and an unusable one.


4. The Economics

Attested state infrastructure exhibits a specific economic structure:

High ongoing costs. Integrating multiple sources, normalizing data across formats, building canonical representations, implementing cryptographic signing, and maintaining continuous operation requires significant investment. AI-assisted development has compressed the initial build cost substantially, but the ongoing cost of maintaining integrations against changing APIs, monitoring operational reliability, and handling degraded states remains high — and is incurred continuously regardless of how quickly the system was originally built.

Near-zero marginal costs. Serving an additional query costs fractions of a cent. The infrastructure scales horizontally. The thousandth consumer costs the same as the millionth.

Non-rival consumption. One system consuming a signed Bitcoin volatility reading does not diminish another system's ability to consume the same reading. The data can be served to an unlimited number of consumers simultaneously.

Verification externality. Each signature can be verified by any party without contacting the provider. This creates a positive externality: the more widely the data is consumed, the more independently verified it becomes, increasing confidence in the system without additional cost to the provider.

Standardization returns. When multiple consumers use the same canonical format and methodology, coordination costs decrease across the ecosystem. Systems can compare, audit, and compose attested state from the same provider without translation overhead.

This combination — high fixed costs, near-zero marginal costs, non-rival consumption, and verification externalities — creates strong economies of scale that favor specialized providers over duplicated infrastructure. Not monopoly, but concentration. The same dynamics that produced Bloomberg for financial terminals, Cloudflare for edge infrastructure, and Stripe for payment processing.


5. Historical Precedent: LIBOR and the Cost of Unverifiable Shared State

The economics described above are not theoretical. Financial markets have already experienced what happens when shared state infrastructure lacks cryptographic verification — and what happens when that absence is exploited.

For decades, the London Interbank Offered Rate (LIBOR) functioned as attested state infrastructure for the global financial system. Approximately $350 trillion in derivatives, mortgages, and loans referenced LIBOR as their shared representation of interbank borrowing costs. It was produced by a panel of banks (aggregation), published daily by a central administrator (distribution), and consumed by millions of contracts worldwide (non-rival shared state).

LIBOR exhibited every property of attested state infrastructure except one: independent verifiability. Submissions were self-reported by panel banks. There was no canonical format tied to observable transactions. There was no cryptographic signature. There was no mechanism for a consumer to verify that a submission reflected actual borrowing costs rather than a number chosen to benefit the submitter's trading positions.

The result was one of the largest benchmark-manipulation scandals in financial history. Between 2003 and 2012, traders at multiple banks manipulated LIBOR submissions to profit their derivatives books. The manipulation was possible because the system relied on reputational trust rather than verifiable attestation. Consumers — including pension funds, municipalities, and mortgage holders — had no way to independently verify what they were consuming.

The global response was to replace LIBOR with rates derived from observable transactions (SOFR in the United States, SONIA in the United Kingdom). The transition cost the financial industry an estimated $100-300 billion in systems migration alone. The underlying motivation was precisely the verification property described in this paper: shared state consumed by millions of contracts must be independently verifiable, not dependent on the honesty of its producers.

Attested state infrastructure is the generalization of this lesson. LIBOR was one reference rate for one asset class. As machine-to-machine coordination expands across markets, logistics, insurance, and commerce, the number of shared reference states grows by orders of magnitude. Each one presents the same choice: trust the provider, or verify the attestation.


6. Economic Analysis: The Cost of Duplication

To illustrate the efficiency argument, consider a specific case: producing a normalized, real-time implied volatility surface from cryptocurrency options markets. The estimates below are order-of-magnitude — actual costs vary by team, geography, and scope — but the directional economics are robust.

Illustrative costs under AI-assisted development (single provider):

Component                           | Build (AI-assisted) | Build (human-only, historical) | Annual Maintenance
Exchange integrations (3 exchanges) | 3-5 days            | 6-8 weeks                      | $15,000
Normalization layer                 | 2-3 days            | 4-6 weeks                      | $5,000
Aggregation methodology             | 1-2 days            | 3-4 weeks                      | $2,000
Canonical format and signing        | 1 day               | 2-3 weeks                      | $1,000
Monitoring, failover, degradation   | 1-2 days            | 2-3 weeks                      | $5,000
Historical archiving and replay     | 1 day               | 1-2 weeks                      | $3,000
Total                               | ~2 weeks            | 18-26 weeks                    | $31,000/year

AI-assisted development compresses the initial build from $90,000-$130,000 to roughly $10,000-$20,000 — a 5-10× reduction. This is the reality of software development in 2026.

But the annual maintenance column does not compress. Exchanges change APIs on their own schedule. Monitoring runs 24/7 regardless of how the code was written. Failover handling, data quality checks, and operational reliability are ongoing costs that scale with time, not with initial development speed.

The critical insight: maintenance dominates.

Over a three-year period, the cost profile shifts dramatically:

Timeframe | Build Cost (AI-assisted) | Annual Maintenance | Cumulative Total | Maintenance as % of Total
Year 1    | $15,000                  | $31,000            | $46,000          | 67%
Year 2    | -                        | $31,000            | $77,000          | 81%
Year 3    | -                        | $31,000            | $108,000         | 86%

By year three, 86% of the total cost is maintenance — the component that AI-assisted development does not significantly reduce. The build cost that was compressed 5-10× by AI tooling becomes a rounding error. The dominant economic term is the ongoing operational burden.
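
The arithmetic behind this table can be restated directly from the illustrative estimates above ($15,000 one-time build, $31,000 per year of maintenance):

    # Reproduces the three-year cost profile from the illustrative estimates above.
    build = 15_000         # one-time, AI-assisted initial build
    maintenance = 31_000   # recurring annual operational cost

    for year in (1, 2, 3):
        cumulative_maintenance = maintenance * year
        total = build + cumulative_maintenance
        share = cumulative_maintenance / total
        print(f"Year {year}: total ${total:,}, maintenance share {share:.0%}")

    # Year 1: total $46,000, maintenance share 67%
    # Year 2: total $77,000, maintenance share 81%
    # Year 3: total $108,000, maintenance share 86%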

The duplication cost at scale:

Number of independent developers | Total 3-year cost (AI-assisted build) | Total 3-year maintenance alone
1                                | $108,000                              | $93,000
100                              | $10,800,000                           | $9,300,000
1,000                            | $108,000,000                          | $93,000,000
10,000                           | $1,080,000,000                        | $930,000,000

At 10,000 consumers, the industry spends $930 million over three years on duplicated maintenance alone — not building, but independently monitoring, patching, and operating identical infrastructure against the same set of exchange APIs. This is pure deadweight loss: economic waste that produces no additional information.

The paradox of cheaper building:

Lower build costs make the duplication problem worse, not better. When building an exchange integration required $130,000 and six months of engineering, few developers attempted it independently. The high barrier to entry was itself a natural consolidation force. When AI-assisted development reduces the same build to $15,000 and two weeks, far more developers attempt it. The number of independent implementations increases. The aggregate maintenance waste increases proportionally.

This dynamic intensifies even as AI-assisted maintenance improves. If autonomous systems eventually detect and repair their own integration failures, the cost of any single maintenance event decreases — but the duplication, canonicalization, and verification arguments remain unchanged. Ten thousand AI agents independently self-healing against the same API change is still ten thousand redundant detection and repair cycles. The inefficiency is in the multiplication of effort, not the difficulty of any single maintenance task. The durable economic force is not that maintenance is hard — it is that maintaining the same shared state independently, at scale, is structurally wasteful regardless of how capable each individual maintainer becomes.

This is the counterintuitive core of the economic argument: the cheaper it becomes to build, the more valuable shared infrastructure becomes, because more entities incur the ongoing maintenance cost that shared infrastructure eliminates.

The shared infrastructure alternative:

A single attested provider builds and maintains the infrastructure once. Each additional consumer costs approximately $0.01 per query in serving costs. At 10,000 consumers each making 1,000 queries per year over three years:

Metric                     | Duplicated Model (AI-assisted) | Shared Infrastructure
Total 3-year cost          | $1,080,000,000                 | $408,000 ($108K ops + $300K serving)
Cost per consumer per year | $36,000                        | $13.60
Efficiency ratio           | -                              | ~2,600×

Even with AI-assisted build costs, the shared model is roughly three orders of magnitude more efficient. The exact ratio matters less than the structural reality: duplicated maintenance of identical state is pure economic waste, and lower build barriers amplify rather than reduce this waste.
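
The comparison follows directly from the stated assumptions: roughly $0.01 per query, 10,000 consumers, 1,000 queries per consumer per year, three years, and the single provider's $108,000 three-year operating cost.

    # Reproduces the duplicated-vs-shared comparison from the stated assumptions.
    consumers = 10_000
    queries_per_year = 1_000
    years = 3
    cost_per_query = 0.01            # assumed serving cost per query
    per_team_3yr = 108_000           # build + 3 years of maintenance, from above

    duplicated_total = consumers * per_team_3yr                      # $1.08B
    serving = consumers * queries_per_year * years * cost_per_query  # $300K
    shared_total = 108_000 + serving                                 # $408K

    print(f"duplicated total:            ${duplicated_total:,.0f}")
    print(f"shared total:                ${shared_total:,.0f}")
    print(f"duplicated, per consumer/yr: ${per_team_3yr / years:,.0f}")
    print(f"shared, per consumer/yr:     ${shared_total / (consumers * years):.2f}")
    print(f"efficiency ratio:            ~{duplicated_total / shared_total:.0f}x")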

The verification premium:

The shared model introduces a dependency: consumers must trust the provider. In a traditional API model, this trust dependency is the primary argument against consolidation — concentrated providers become single points of failure.

Cryptographic attestation resolves this. Because every response is signed and independently verifiable, the trust dependency is eliminated. A consumer can verify every attestation against the provider's public key. A competitor can audit every historical response. A regulator can replay any contested data point. The verification property transforms the consolidation risk from "trusting a single provider" to "verifying a mathematical signature" — a fundamentally different and more manageable risk profile.

This is why attested state infrastructure is economically distinct from traditional SaaS. The verification property makes consolidation rational by eliminating the trust cost that normally limits it.


7. The Verification Externality

A verification externality occurs when each additional verifier increases confidence in a shared attestation system without increasing the provider's trust-establishment costs. This property has no direct analogue in traditional data infrastructure.

In a conventional API, trust is bilateral. Each consumer must independently decide whether to trust the provider. The consumer's confidence is based on reputation, contractual terms, and past experience. One consumer's trust decision provides no information to another consumer. Trust does not compound.

In attested state infrastructure, trust is multilateral and cumulative. When a provider signs a canonical attestation, any party — consumer, counterparty, auditor, regulator, or adversary — can verify the signature against a known public key. This creates several compounding effects:

Replayability. Any historical attestation can be replayed and verified at any future time. Disputes about what the provider attested at a specific moment are resolved mathematically, not contractually. This reduces the cost of dispute resolution to near zero.

Adversarial validation. Consumers, competitors, and adversaries can all verify attestations independently. A provider that signs incorrect data is detectable by any party with the public key and access to the underlying sources. This imposes a continuous integrity constraint that strengthens with the number of observers.

Composable trust. When multiple systems consume the same attested state, they share a common reference point. Two counterparties in a transaction can independently verify that they are acting on the same information. This reduces coordination failures and eliminates an entire category of disagreements about shared state.
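
One simple way to mechanize that shared-reference check (an illustration, not a prescribed protocol) is for each counterparty to hash the canonical attestation bytes it holds and compare digests before acting:

    # Two counterparties confirm they hold the same attested state by comparing
    # digests of the canonical bytes. Illustrative only; not a prescribed protocol.
    import hashlib

    def state_digest(canonical_attestation: bytes) -> str:
        return hashlib.sha256(canonical_attestation).hexdigest()

    # Each party computes the digest locally and exchanges only the short hash.
    party_a = state_digest(b'{"metric":"BTC_IMPLIED_VOL_30D","value":0.52}')
    party_b = state_digest(b'{"metric":"BTC_IMPLIED_VOL_30D","value":0.52}')
    assert party_a == party_b  # both are acting on identical shared state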

Dispute minimization. In settlement systems, insurance adjudication, and conditional contracts, disputes often arise from disagreements about facts: what was the price, what was the weather, what was the exchange rate? Cryptographically signed attestations with deterministic canonical formats reduce factual disputes to signature verification — a computation, not a negotiation.

Zero-cost confidence scaling. Adding a new consumer does not require the provider to do additional trust-building work. The signature mechanizes trust verification. This means confidence in the system scales with adoption at zero marginal cost — the opposite of reputation-based trust, which requires continuous investment.

These effects combine to create an infrastructure category where adoption itself improves the system's trustworthiness — a property that traditional APIs, regardless of their quality, cannot replicate.

8. Beyond Financial Markets

The pattern extends well beyond market data. Any domain where software systems require shared, verifiable representations of real-world state can benefit from attested infrastructure:

Weather and climate state. Logistics systems rerouting cargo, parametric insurance contracts settling claims, agricultural systems adjusting irrigation — all require machine-verifiable weather data with provenance and temporal precision. A signed attestation of rainfall at specific coordinates over a specific window is more useful to an automated system than a probability forecast designed for human interpretation.

Marine and shipping state. Vessel routing systems, port scheduling engines, and cargo insurance platforms require verifiable sea state, vessel positions, and voyage forecasts. Signed attestations of wave height, wind speed, and vessel ETA enable machine-to-machine coordination without human intermediaries.

Economic indicators. Macro-aware systems — those adjusting portfolio allocations, pricing credit risk, or forecasting demand — require verifiable readings of inflation, employment, interest rates, and commodity prices. Canonical, signed representations allow these systems to consume the same data with the same interpretation, reducing coordination failures.

Settlement infrastructure. Discreet Log Contracts and other conditional payment systems require external attestations to settle. A signed statement that "BTC was above $80,000 at timestamp T" enables an on-chain contract to release funds without human adjudication. The attestation is the settlement mechanism.
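
An attestation of this kind can be very small. The sketch below signs a hypothetical outcome statement with an Ed25519 key using the Python 'cryptography' package; it is not the actual Discreet Log Contract message format, only an illustration of attestation-as-settlement.

    # Illustrative outcome attestation for a conditional contract. This is not
    # the actual DLC message format; it only shows the attestation pattern.
    # Requires: pip install cryptography
    import json
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    oracle_key = Ed25519PrivateKey.generate()
    outcome = {
        "statement": "BTC_USD > 80000",
        "result": True,
        "observed_at": "2026-01-15T00:00:00Z",
    }
    message = json.dumps(outcome, sort_keys=True, separators=(",", ":")).encode()
    attestation = oracle_key.sign(message)

    # A settlement contract holding the oracle's public key can release funds
    # if and only if this signature verifies against the agreed outcome message.
    oracle_key.public_key().verify(attestation, message)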

In each case, the economics are identical: high fixed cost of aggregation, near-zero marginal cost of serving, non-rival consumption, and cryptographic verification that eliminates vendor trust.


9. What Creates Value

It is important to distinguish between the signature and the infrastructure beneath it.

Cryptographic signing is computationally trivial. Any developer can sign data. The signature alone is not a moat. What creates value is the infrastructure that produces the state worth signing:

  - Reliable aggregation across many independent, changing sources
  - Rigorous normalization into a canonical representation
  - A defensible, consistently applied methodology
  - Continuous operation: monitoring, failover, and historical archiving

The signature proves integrity. The infrastructure creates value. Providers that conflate the two will find that signatures are easily replicated while infrastructure is not.


10. Scaling Dynamics

The demand for attested state infrastructure scales with the number of software systems consuming shared state. As more systems participate in markets, logistics, insurance, lending, and commerce, the value of shared, verifiable state increases superlinearly:

  - Each additional consumer adds independent verification of the same attestations (the verification externality)
  - Each pair of coordinating systems gains a common, verifiable reference point (composable trust)
  - Each contract or settlement that references the state replaces a bilateral dispute process with a signature check

The infrastructure is not valuable because automation exists. It is valuable because systems need to coordinate with each other and with the physical world using shared representations of observed state that do not require human intermediation.

This is not a prediction about a distant future. Software systems are already consuming signed market data, settling financial contracts against attested price feeds, and adjusting behavior based on verified environmental state. The infrastructure category exists. It needs a name.

We propose: attested state infrastructure.


11. Conclusion

Attested state infrastructure is the shared, cryptographically signed, machine-verifiable representation of real-world state that software systems require to consume shared information safely and act on it economically. Its economics — high fixed costs, near-zero marginal costs, non-rival consumption, and verification externalities — favor specialized providers over duplicated effort. Its verification property — independent confirmability without vendor trust — is the precondition for machines to consume shared state reliably.

As machine-to-machine coordination scales, demand for this infrastructure will grow across every domain where systems act on shared representations of the world: markets, logistics, insurance, lending, weather, energy, and commerce. The providers that aggregate reliably, normalize rigorously, sign cryptographically, and serve at marginal cost will become the foundational infrastructure of machine coordination.

Not oracles. Not APIs. Attested state infrastructure.



About the Author

Jonathan Bulkeley is the founder of Mycelia Signal, a provider of attested state infrastructure for financial markets, weather, marine, and economic indicators.