The new web: building machine-inclusive national digital infrastructure
The internet is entering a structural transition that will shape economic competitiveness, public-service delivery, and national resilience for the next decade. AI systems are beginning to read and interpret online content at scale. Their capabilities are still immature, but they are improving on an exponential curve: once machines can reliably interpret information, adoption accelerates across sectors far faster than human institutions can adapt. This is not a behavioral trend; it is a computational one.
The web is hostile to machine interpretation
Today's web was not designed for machine interpretation. It is a visual medium built for human eyes, human inference, and human navigation. Machines do not see layout, color, spacing, animation, or implicit meaning. They cannot reliably interpret JavaScript-rendered content or hidden state. They cannot determine authorship, trustworthiness, or the boundaries of an action unless these things are made explicit. As a result, AI systems frequently misinterpret public information, misroute users, or fail silently. This creates risks for public communication, accessibility, service delivery, and regulatory compliance.
This is the invisible failure of the modern web, and it has direct implications for national digital policy.
A second challenge: fragmentation
A second structural challenge is emerging. The agentic web, the ecosystem of AI agents that read, compare, and act on online content, resembles the human web of the mid-1990s: fragmented, vendor-led, and incompatible. Each platform is developing its own protocols. Each vendor is shipping its own implementation in pursuit of an early lead. The components exist, from machine-readable content formats to agent-to-agent protocols, but the stack has no shared governance, no common venue, and no unifying contract. Without intervention, this fragmentation will harden, creating long-term barriers to interoperability, accessibility, and public oversight.
Addressing these challenges requires a new layer of digital infrastructure.
MX: the contract layer for public content
At the center of this work is MX, the Machine Experience standard. MX is the discipline of adding metadata and instructions to digital content so AI systems do not have to guess. It does not replace existing standards; it complements them. MX provides the explicit meaning, structure, provenance, and boundaries that machines require to interpret information safely and consistently. It transforms public-sector content from a visual artefact into a reliable, machine-readable asset.
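To make the idea concrete, the sketch below shows what an MX-style declaration might look like. The field names and shape are illustrative assumptions, not the published standard; they simply make explicit the four things the text says machines need: meaning, structure, provenance, and boundaries.

```typescript
// Hypothetical sketch of an MX-style declaration attached to a public web page.
// Field names are illustrative; the actual MX standard may differ.
interface MXDeclaration {
  meaning: {
    type: string;          // what this content is, stated explicitly
    summary: string;       // plain-language statement of what the page asserts
  };
  structure: {
    sections: { id: string; purpose: string }[]; // explicit map of the document's parts
  };
  provenance: {
    publisher: string;     // who published the content
    publishedAt: string;   // ISO 8601 timestamp
    authorship: "human" | "ai" | "automated";
  };
  boundaries: {
    permittedActions: string[];   // what an agent may do with this content
    prohibitedActions: string[];  // what it must not do
  };
}

// Example: a benefits page declaring its meaning and limits up front,
// so an AI system reads the declaration instead of guessing from layout.
const declaration: MXDeclaration = {
  meaning: {
    type: "service-description",
    summary: "Describes eligibility and application steps for a housing benefit.",
  },
  structure: {
    sections: [
      { id: "eligibility", purpose: "Lists who may apply" },
      { id: "how-to-apply", purpose: "Describes the application workflow" },
    ],
  },
  provenance: {
    publisher: "example-city-council.gov",
    publishedAt: "2025-01-15T09:00:00Z",
    authorship: "human",
  },
  boundaries: {
    permittedActions: ["summarize", "cite", "link"],
    prohibitedActions: ["submit-application-on-behalf-of-user"],
  },
};
```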
COGS: the economics of reliable interpretation
The second component is COGS, the Community-Owned Governance System. COGS is the constitutional framework that ensures MX remains open, neutral, and interoperable. It defines how machine-readable contracts are created, maintained, and validated. Crucially, COGS reduces the need for inference. When a document is governed by a cog, an AI system does not infer; it executes. The meaning is explicit. The workflow is explicit. The provenance is explicit. This shift has significant public-sector implications.
Inference is computationally expensive; execution is efficient. Inference consumes energy; execution requires far less. Inference introduces error; execution preserves accuracy.
By reducing inference, COGS reduces compute cost, reduces energy consumption, and increases the reliability of machine-interpreted public information. This directly supports national goals around sustainability, digital trust, and cost-efficient service delivery.
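To make the shift concrete, here is a minimal sketch of an agent's read path. The Cog shape and helper functions are hypothetical assumptions, but they show where inference drops out when a governing cog is present.

```typescript
// Illustrative sketch only: the Cog shape and helper functions below are
// hypothetical, showing the execute-vs-infer branch described above.
interface Cog {
  meaning: Record<string, string>;                 // explicit semantics, no guessing
  workflow: { step: string; action: string }[];    // declared steps an agent follows
  provenance: { publisher: string; attestedAt: string };
}

// Stub: look up whether a document is governed by a cog (assumed mechanism).
async function fetchCog(url: string): Promise<Cog | null> {
  return null; // placeholder
}

// Stub: fall back to model inference for ungoverned content (assumed mechanism).
async function inferMeaningWithLLM(url: string): Promise<string> {
  return `best-effort inference for ${url}`;
}

async function interpretDocument(url: string): Promise<string> {
  const cog = await fetchCog(url);
  if (cog) {
    // Governed document: execute the declared workflow; no inference needed.
    return cog.workflow.map((s) => `${s.step}: ${s.action}`).join(" -> ");
  }
  // Ungoverned document: expensive, error-prone inference.
  return inferMeaningWithLLM(url);
}
```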
Interoperability across systems and jurisdictions
COGS also enables interoperability across systems, vendors, and jurisdictions. Today, every AI system must interpret every site differently. Every CMS outputs a different flavor of markup. Every vendor invents its own metadata. This fragmentation forces AI systems to perform bespoke inference for every domain they encounter. It is the digital equivalent of every country having its own electrical socket.
COGS standardizes the contract. A cog defines the data, scripts, workflows, and boundaries in a way that any compliant system can understand. This reduces integration cost, improves cross-agency interoperability, and creates a stable foundation for national and international digital infrastructure.
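Continuing the hypothetical sketch above, the value of standardization is that every compliant consumer parses one predictable shape rather than site-specific markup. The structure below is an illustrative assumption naming the four elements just described: data, scripts, workflows, and boundaries.

```typescript
// Hypothetical contract shape; real cogs may differ. The point is that every
// compliant system parses the same structure instead of bespoke per-site markup.
interface CogContract {
  data: Record<string, unknown>;                    // the facts the document asserts
  scripts: { name: string; endpoint: string }[];    // callable operations it exposes
  workflows: { name: string; steps: string[] }[];   // declared multi-step procedures
  boundaries: { allowed: string[]; disallowed: string[] }; // limits on agent action
}

// A minimal compliance check: a consumer needs only the contract, not the site.
function isActionPermitted(contract: CogContract, intendedAction: string): boolean {
  return (
    contract.boundaries.allowed.includes(intendedAction) &&
    !contract.boundaries.disallowed.includes(intendedAction)
  );
}
```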
The Gathering: transparent standards governance
The third component is The Gathering, the vendor-neutral standards forum for the agentic web. It provides the public venue that the agent stack currently lacks. Drafts are developed openly. Reviews occur transparently. No single vendor controls the process. This model mirrors the standards-community posture that enabled the modern web to emerge from the fragmentation of the 1990s. It ensures that the machine-inclusive web evolves through public oversight, not proprietary control.
What this means for public institutions
Together, MX, COGS, The Gathering, and Reginald form the architecture of the new web. MX provides the contract. COGS provides the constitution. The Gathering provides the stewardship. Reginald provides the trust layer.
Reginald is the public registry that attests the provenance of any document: who published it, that it has not been modified since publication, and whether it was produced by a human, an AI, or an automated system. For public institutions, this matters on two fronts. First, AI systems reading attested documents hallucinate less: they cite verified facts rather than inferences, which directly reduces the risk of misinformation propagating through AI-mediated public services. Second, the EU AI Act, the European Accessibility Act, and digital-records legislation across multiple jurisdictions are converging on the same requirement: that organizations can demonstrate the provenance of content that AI systems acted on. Reginald's attestation, cryptographically verifiable and document-native, answers that requirement at any point in the chain.
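The verification step itself can be sketched simply. The attestation record below is a hypothetical shape, not Reginald's actual schema; the hashing uses Node's standard crypto module to illustrate the "not modified since publication" check, and a production verifier would also check a cryptographic signature binding the record to the registry.

```typescript
import { createHash } from "node:crypto";

// Hypothetical attestation record; Reginald's actual schema may differ.
interface Attestation {
  publisher: string;                     // who published the document
  publishedAt: string;                   // ISO 8601 timestamp
  producedBy: "human" | "ai" | "automated";
  sha256: string;                        // digest of the document at publication
}

// Check the "unmodified since publication" property by re-hashing the document
// and comparing it to the attested digest.
function verifyUnmodified(documentBytes: Buffer, attestation: Attestation): boolean {
  const digest = createHash("sha256").update(documentBytes).digest("hex");
  return digest === attestation.sha256;
}
```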
For public institutions and national innovation funds, this architecture offers a direct path towards more accurate and accessible public information, reduced energy and compute cost for AI-mediated services, and interoperability across agencies, vendors, and jurisdictions. It supports transparent, community-governed standards, reduces long-term dependency on proprietary ecosystems, and prepares national infrastructure for the rapid expansion of machine-readable services.
Machines are beginning to read. Their adoption will grow faster than human institutions can adapt. The web is not ready. MX provides the missing layer. COGS provides the economic and governance foundation. The Gathering provides the public venue. Reginald provides the trust signal.
The foundation of the machine-inclusive web is being built now.