This is Part 3 in a five-part series on optimizing websites for the agentic web. Part 1 covered the evolution from SEO to AAIO. Part 2 explored how to get your content cited in AI responses. This article goes deeper: the protocols forming the infrastructure layer that make everything else possible.

The early web needed HTTP to move data, HTML to structure content, and the W3C to keep everyone building on the same foundation. Without these shared standards, we'd have ended up with a fragmented collection of incompatible networks instead of a single web.

The agentic web is at that same inflection point. AI agents need standardized ways to connect to tools, talk to each other, query websites, and understand codebases. Without shared protocols, every AI vendor builds proprietary integrations, and the result is the same fragmentation the early web narrowly avoided.

Four protocols are emerging as the foundational layer. This article covers what each does, who's behind it, and what it means for your business. Throughout this series, we draw only from official documentation, research papers, and announcements from the companies building this infrastructure.

Why Standards Matter

Consider how the original web came together. In the early 1990s, competing browser vendors and incompatible standards were fragmenting what should have been a unified network. The W3C brought order by establishing shared protocols. HTTP handled transport. HTML handled structure. Everyone agreed on the rules, and the web took off.

AI is at a similar crossroads. Right now, every major AI company is building agents that need to interact with external tools, data sources, other agents, and websites. Without standards, connecting your business systems to AI means building separate integrations for Claude, ChatGPT, Gemini, Copilot, and whatever comes next. That's the M x N problem: M different AI models times N different tools equals an unsustainable number of custom connections.
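The arithmetic behind the M x N problem is easy to sketch. This is a minimal illustration (the model and tool names are just examples, not a real inventory):

```python
# Illustrative only: compare point-to-point custom integrations
# with a single shared protocol like MCP.
models = ["Claude", "ChatGPT", "Gemini", "Copilot"]            # M = 4
tools = ["CRM", "inventory", "billing", "analytics",
         "site search", "docs"]                                 # N = 6

# Without a standard: every (model, tool) pair needs its own connector.
point_to_point = len(models) * len(tools)

# With a standard: each side implements the protocol once.
shared_protocol = len(models) + len(tools)

print(point_to_point)   # 24 custom integrations
print(shared_protocol)  # 10 protocol implementations
```

Even at this toy scale the gap is 24 versus 10, and it widens quadratically as either side grows.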

What makes this moment remarkable is who's building the solution together. On Dec. 9, 2025, the Linux Foundation announced the Agentic AI Foundation (AAIF), a vendor-neutral governance body for agentic AI standards. Eight platinum members anchor it: AWS, Anthropic, Block, Bloomberg, Cloudflare, Google, Microsoft, and OpenAI.

OpenAI, Anthropic, Google, and Microsoft. Competing on AI products, collaborating on AI infrastructure. As Linux Foundation Executive Director Jim Zemlin put it: "We're seeing AI enter a new phase, as conversational systems shift to autonomous agents that can work together."

This is a bigger deal than most people realize. Rivals are building shared infrastructure because they all recognize that proprietary standards would hold back the entire ecosystem, including themselves.

MCP: The Universal Adapter

What it is: The Model Context Protocol (MCP) is an open standard for connecting AI applications to external tools, data sources, and workflows.

The official analogy is apt:

"Think of MCP like a USB-C port for AI applications. Just as USB-C provides a standardized way to connect electronic devices, MCP provides a standardized way to connect AI applications to external systems."

Before MCP, if you wanted your database, CRM, or internal tools accessible to an AI assistant, you had to build a custom integration for each AI platform. MCP replaces that with a single standard interface. Build one MCP server for your data, and every MCP-compatible AI system can connect to it.
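To make that concrete, here is a simplified sketch of the request/response shapes an MCP server deals with: the protocol is JSON-RPC based, and clients ask a server what tools it offers, then invoke them. The tool name, its behavior, and the hand-rolled dispatcher below are all illustrative; a real server would be built with an official SDK (such as the `mcp` Python package) rather than like this:

```python
# Illustrative MCP-style server logic (not the official SDK).
# "check_inventory" is a hypothetical tool for this example.
TOOLS = [{
    "name": "check_inventory",
    "description": "Look up the stock level for a product SKU.",
    "inputSchema": {"type": "object",
                    "properties": {"sku": {"type": "string"}}},
}]

def handle(request: dict) -> dict:
    """Dispatch a JSON-RPC request roughly the way an MCP server would."""
    if request["method"] == "tools/list":
        # Client asks: what can you do?
        result = {"tools": TOOLS}
    elif request["method"] == "tools/call":
        # Client invokes a tool; a real server would query your
        # inventory system here instead of returning a canned answer.
        sku = request["params"]["arguments"]["sku"]
        result = {"content": [{"type": "text",
                               "text": f"SKU {sku}: 42 in stock"}]}
    else:
        result = {"error": "unknown method"}
    return {"jsonrpc": "2.0", "id": request["id"], "result": result}

resp = handle({"jsonrpc": "2.0", "id": 1, "method": "tools/list", "params": {}})
print(resp["result"]["tools"][0]["name"])  # check_inventory
```

The point of the standard is that this one interface, however it is implemented internally, is the same one every MCP-compatible client speaks.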

The numbers are striking. MCP launched as an open-source project from Anthropic on Nov. 25, 2024. In just over a year, it reached 97 million monthly SDK downloads across Python and TypeScript, with over 10,000 public MCP servers built by the community.

The adoption timeline tells the story. Anthropic's Claude had native MCP support from day one. In March 2025, OpenAI CEO Sam Altman announced support across OpenAI's products, stating: "People love MCP and we are excited to add support across our products." Google followed in April, confirming MCP support in Gemini. Microsoft joined the MCP steering committee at Build 2025 in May, with MCP support in VS Code reaching general availability in July 2025.

From internal experiment to industry standard in 12 months. That pace of adoption signals something real.

What this means for your business: If your data, tools, or services are MCP-accessible, every major AI platform can use them. That's not a theoretical benefit. It means an AI assistant helping your customer can pull real-time product availability from your inventory system, check order status from your CRM, or retrieve pricing from your database, all through one standardized connection rather than platform-specific integrations.

A2A: How Agents Talk To Each Other

What it is: The Agent2Agent protocol (A2A) enables AI agents from different vendors to discover each other's capabilities and collaborate on tasks.

If MCP is how agents connect to tools, A2A is how agents connect to each other. The distinction matters. In a world where businesses use AI agents from Salesforce for CRM, ServiceNow for IT, and an internal agent for billing, those agents need a way to discover what one another can do, delegate tasks, and coordinate responses. A2A provides that.

Google launched A2A on April 9, 2025, with over 50 technology partners. By June, Google had donated the protocol to the Linux Foundation. By July, version 0.3 shipped with over 150 supporting organizations, including Salesforce, SAP, ServiceNow, PayPal, Atlassian, Microsoft, and AWS.

The core concept is the Agent Card: a JSON metadata document that serves as a digital business card for agents. Each A2A-compatible agent publishes an Agent Card at a standard web address (/.well-known/agent-card.json) describing its identity, capabilities, skills, and authentication requirements. When one agent needs help with a task, it reads another agent's card to understand what that agent can do, then communicates through A2A to request collaboration.
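Here is a sketch of what reading an Agent Card looks like. The field names below follow the general shape of the A2A spec (name, url, capabilities, skills), but the card contents and the helper function are simplified illustrations, not a complete schema:

```python
# Illustrative Agent Card for a hypothetical billing agent.
# Consult the A2A spec for the authoritative schema.
import json

card_json = """{
  "name": "billing-agent",
  "description": "Calculates and issues customer refunds.",
  "url": "https://agents.example.com/billing",
  "capabilities": {"streaming": false},
  "skills": [
    {"id": "calculate_refund",
     "name": "Calculate refund",
     "description": "Compute the refund due for an order."}
  ]
}"""

card = json.loads(card_json)

def can_handle(card: dict, skill_id: str) -> bool:
    """Check whether an agent's card advertises a given skill."""
    return any(s["id"] == skill_id for s in card.get("skills", []))

print(can_handle(card, "calculate_refund"))  # True
print(can_handle(card, "process_payment"))   # False
```

Discovery is exactly this simple in concept: fetch the card from the well-known URL, check the advertised skills, then open an A2A conversation with the agent that has the skill you need.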

Google's own framing of how these pieces fit together is helpful: "Build with ADK, equip with MCP, communicate with A2A." ADK (Agent Development Kit) is Google's framework for building agents, MCP gives them access to tools, and A2A lets them talk to other agents.

Here's a practical example. A customer contacts your company with a billing question that requires a refund. Your customer service agent (built on one platform) identifies the issue, passes the context to your billing agent (built on another platform) via A2A, which calculates the refund amount and hands off to your payments agent (yet another platform) to process it. The customer sees one seamless interaction. Behind the scenes, three agents from different vendors collaborated through a shared protocol.

The enterprise adoption signal is strong. When Salesforce, SAP, ServiceNow, and every major consultancy sign on to a protocol within months, it's because their enterprise clients are already running into the multi-vendor agent coordination problem that A2A solves.

NLWeb: Making Websites Conversational

What it is: NLWeb (Natural Language Web) is an open project from Microsoft that turns any website into a natural language interface, queryable by both humans and AI agents.

Of the four protocols covered here, NLWeb is the most immediately relevant to this series' audience. MCP, A2A, and AGENTS.md are primarily developer concerns. NLWeb is about your website.

NLWeb was announced at Microsoft Build 2025 on May 19, 2025. It was conceived and developed by R.V. Guha, who joined Microsoft as CVP and Technical Fellow. If that name sounds familiar, it should: Guha is the creator of RSS, RDF, and Schema.org, three standards that fundamentally shaped how the web organizes and syndicates information. When the person behind Schema.org builds a new web protocol, it's worth paying attention.

The key insight behind NLWeb is that websites already publish structured data. Schema.org markup, RSS feeds, product catalogs, recipe databases. NLWeb leverages these existing formats, combining them with AI to let users and agents query a website's content using natural language instead of clicking through pages.

Microsoft's framing is deliberate: "NLWeb can play a similar role to HTML in the emerging agentic web." The NLWeb README puts it even more directly: "NLWeb is to MCP/A2A what HTML is to HTTP."

Every NLWeb instance is automatically an MCP server. That means any website running NLWeb immediately becomes accessible to the entire ecosystem of MCP-compatible AI assistants and agents. Your website's content doesn't just sit there waiting for visitors. It becomes actively queryable by any AI system that speaks MCP.

Early adopters include Eventbrite, Shopify, Tripadvisor, O'Reilly Media, Common Sense Media, and Hearst. These are content-rich websites that already invest heavily in structured data. NLWeb builds directly on that investment.

Here's what this looks like in practice. Instead of a user navigating Tripadvisor's search filters to find family-friendly restaurants in Barcelona with outdoor seating, an AI agent could query Tripadvisor's NLWeb endpoint: "Find family-friendly restaurants in Barcelona with outdoor seating and good reviews." The response comes back as structured Schema.org JSON, ready for the agent to present to the user or act on.
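As a rough sketch of that agent-side flow: build a natural language query against a site's endpoint, then pull typed results out of the Schema.org-style JSON that comes back. The endpoint path, parameter name, and response shape below are assumptions for illustration, not taken from the NLWeb spec, and the sample response is canned rather than fetched:

```python
# Hypothetical agent-side NLWeb interaction. The /ask path, the
# "query" parameter, and the response layout are illustrative only.
import json
from urllib.parse import urlencode

base = "https://example.com/ask"  # hypothetical NLWeb endpoint
question = "family-friendly restaurants in Barcelona with outdoor seating"
url = f"{base}?{urlencode({'query': question})}"

# A canned Schema.org-style response an agent might receive.
sample_response = json.loads("""{
  "results": [{
    "@type": "Restaurant",
    "name": "Casa Exemple",
    "servesCuisine": "Catalan",
    "amenityFeature": "outdoor seating"
  }]
}""")

# Because results are typed Schema.org objects, the agent can filter
# and extract fields without scraping HTML.
names = [r["name"] for r in sample_response["results"]
         if r["@type"] == "Restaurant"]
print(names)  # ['Casa Exemple']
```

The structured response is the whole point: the agent gets machine-readable Restaurant objects it can rank, compare, or act on, instead of a rendered page it has to parse.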

If your business has already invested in Schema.org markup (and Part 2 of this series explained why you should), you're closer to NLWeb readiness than you might think.

AGENTS.md: Instructions For AI Coders

What it is: AGENTS.md is a standardized Markdown file that gives AI coding agents project-specific guidance, essentially a README written for machines instead of humans.

This protocol is less immediately relevant to the marketers and strategists reading this series, but it's an important piece of the complete picture, especially if your organization has development teams using AI coding tools.

AGENTS.md emerged from a collaboration between OpenAI Codex, Google Jules, Cursor, Amp, and Factory. The problem they were solving: AI coding agents need to understand project conventions, build steps, testing requirements, and architectural decisions before they can contribute useful code. Without explicit guidance, agents make assumptions that lead to inconsistent, buggy output.

Since its launch in August 2025, AGENTS.md has been adopted by over 60,000 open-source projects and is supported by tools including GitHub Copilot, Claude Code, Cursor, Gemini CLI, VS Code, and many others. It is now governed by the Agentic AI Foundation, alongside MCP.

The file itself is simple. Plain Markdown, typically under 150 lines, covering build commands, architectural overview, coding conventions, and testing requirements. Agents read it before making any changes, getting the same tribal knowledge that senior engineers carry in their heads.
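A minimal AGENTS.md might look something like this. The contents below are an illustrative sketch for a hypothetical JavaScript project, not a prescribed template; the format is just plain Markdown, so teams structure it however suits them:

```markdown
# AGENTS.md

## Build
- `npm install`, then `npm run build`.

## Test
- Run `npm test` before every commit; all tests must pass.

## Conventions
- TypeScript strict mode; avoid `any`.
- Components live in `src/components/`, one per file.

## Boundaries
- Never edit files under `src/generated/`.
- Ask before adding new runtime dependencies.
```

A coding agent reads this before touching the repository, so it builds, tests, and names things the way the team already does.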

GitHub reports that Copilot now generates 46% of code for its users. When nearly half of code is AI-generated, having a standard way to ensure agents follow your conventions, security practices, and architectural patterns isn't optional. It's quality control.

Why this matters for your business: If your development teams use AI coding tools (and most do), AGENTS.md ensures those tools produce code that matches your standards. It reduces agent-generated bugs, cuts onboarding time for AI tools on new projects, and provides consistency across teams.

How They Fit Together

These four protocols aren't competing. They're complementary layers in the same stack.

Protocol    Created By               Purpose                              Web Analogy
MCP         Anthropic                Connect agents to tools and data     USB ports
A2A         Google                   Agent-to-agent communication         Email/messaging
NLWeb       Microsoft                Make websites queryable by agents    HTML
AGENTS.md   OpenAI + collaborators   Guide AI coding agents               README files
AAIF        Linux Foundation         Governance and standards body        W3C

The stack works like this: MCP provides the plumbing for agents to access tools and data. A2A enables agents to coordinate with each other. NLWeb makes website content accessible to the entire ecosystem. AGENTS.md ensures AI coding agents build correctly. And the Agentic AI Foundation provides the governance layer, ensuring these protocols remain open, vendor-neutral, and interoperable.

The parallel to the original web is impossible to ignore:

  • HTTP (transport) maps to MCP (tool access) and A2A (agent communication).
  • HTML (content structure) maps to NLWeb (website content for agents).
  • W3C (governance) maps to AAIF (governance).

What's different this time is the speed. HTTP took years to gain broad adoption. MCP went from launch to universal platform support in 12 months. A2A grew from 50 to 150+ partner organizations in three months. NLWeb shipped with major publisher adoption at launch. AGENTS.md reached 60,000 projects within its first few months.

The infrastructure is being built at internet speed, not standards-committee speed. That's partly because the companies involved are the same ones building the agents that need these protocols. They're motivated.

And these four aren't the only protocols emerging. Commerce-specific standards are building the transaction layer: Shopify and Google co-developed the Universal Commerce Protocol (UCP), launched in January 2026 with support from Etsy, Target, Walmart, and Wayfair. OpenAI and Stripe co-developed the Agentic Commerce Protocol (ACP), which powers Instant Checkout in ChatGPT. CopilotKit's AG-UI protocol addresses agent-to-frontend communication, with integrations from LangGraph, CrewAI, and Google ADK. We'll cover the commerce protocols in depth in Part 5.

What This Means For Your Business

You don't need to implement all four protocols tomorrow. But you do need to understand what's being built, because it shapes what your website, tools, and teams should be ready for.

If you've already invested in Schema.org markup, NLWeb is your closest on-ramp. It builds directly on the structured data you already maintain. As NLWeb adoption grows, your Schema.org investment becomes the foundation for making your website conversationally accessible to AI agents. Keep your structured data current and comprehensive.

If you have APIs or internal tools, consider MCP accessibility. Making your services available through MCP means any AI platform can interact with them. For ecommerce, that could mean product catalogs, inventory systems, and order tracking becoming accessible to AI shopping assistants across ChatGPT, Claude, Gemini, and whatever comes next.

If you're evaluating multi-vendor agent workflows, A2A is the protocol to watch. Enterprise organizations running agents from multiple vendors (Salesforce, ServiceNow, internal tools) will increasingly need those agents to coordinate. A2A is the emerging standard for that coordination.

If your development teams use AI coding tools, adopt AGENTS.md now. It's the simplest protocol to implement (it's a single Markdown file) and the one with the most immediate, tangible benefit: fewer bugs, more consistent output, faster onboarding for AI tools on your codebase.

The underlying message across all four protocols is the same: the agentic web is being built on open standards, not proprietary ones. The companies that understand these standards early will be better positioned as AI agents become a primary way consumers interact with businesses.

These aren't things you need to implement today. But they are things you need to understand, because Part 4 of this series gets into the technical specifics of making your website agent-ready.

Key Takeaways

  • Four protocols form the agentic web's infrastructure. MCP (tools), A2A (agent communication), NLWeb (website content), and AGENTS.md (code guidance) are complementary layers, not competitors.
  • The speed of adoption signals real urgency. MCP reached 97 million monthly SDK downloads and universal platform support in 12 months. A2A grew from 50 to 150+ partner organizations in three months. These are not experiments.
  • Rivals are collaborating on infrastructure. OpenAI, Anthropic, Google, and Microsoft are all building shared protocols under the Agentic AI Foundation. This mirrors the W3C moment that unified the early web.
  • NLWeb is perhaps the most relevant protocol for website owners. Built by the creator of Schema.org, it turns your existing structured data into a conversational interface for AI agents. Every NLWeb instance is automatically an MCP server.
  • MCP is the universal adapter. Build one MCP connection for your data, and every major AI platform (Claude, ChatGPT, Gemini, Copilot) can access it. No more building separate integrations for each platform.
  • Start with what you have. Schema.org markup readies you for NLWeb. Existing APIs can become MCP servers. AGENTS.md is a single file your dev team can create today. You don't need to start from scratch.

The original web succeeded because competitors agreed on shared standards. The agentic web is following the same playbook, just faster. The protocols are being established now. The governance is in place. The agents are already using them.

Up next in Part 4: the hands-on technical guide to making your website ready for autonomous AI agents, from semantic HTML to accessibility standards to testing with real agent tools.

This post was originally published on No Hacks.


Featured Image: Collagery/Shutterstock
