In January 2026, Google was granted patent US12536233B1. Six engineers worked on it, and it describes a system that scores a landing page on conversion rate, bounce rate, and design quality. If the landing page falls below a threshold, the system generates an AI replacement personalized to the searcher. The advertiser never sees it. Never approves it. Might not even realize it happened.
The debate around this patent has centered on scope: Is it limited to shopping ads, or does it signal something broader? That's the wrong question.
The right question: What happens when you combine AI-generated pages with AI agents that browse, shop, and transact on behalf of humans?
For the first time, we have the infrastructure for a web where no human creates the page and no human visits it. Both sides can be non-human. That changes everything.
The Supply Side: AI-Generated Pages
The supply side of the web has always been human. Someone designs a page, writes copy, publishes it. Three developments are changing that.
Google's patent US12536233B1 is the most direct: Score a landing page on conversion rate, bounce rate, and design quality, then replace underperforming pages with AI-generated versions. The replacement pages draw on the searcher's full search history, previous queries, click behavior, location, and device data. Google builds personalized landing pages no advertiser can match, because no advertiser has access to cross-query behavioral data at that scale. Barry Schwartz covered the patent on Search Engine Land, describing a system where Google could automatically create custom landing pages, replacing organic results. Glenn Gabe called Google's AI landing page patent potentially more controversial than AI Overviews. Roger Montti at Search Engine Journal argued the patent's scope is limited to shopping and ads. Both camps agree: the technology to score and replace landing pages with AI exists and works.
NLWeb, Microsoft's open project, takes a different approach. NLWeb turns any website into a natural language interface using existing Schema.org markup and RSS feeds. An AI agent querying an NLWeb-enabled site doesn't load a page at all. The agent asks a structured question, NLWeb returns a structured answer. The rendered page becomes optional.
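To make that concrete, here is a minimal sketch of what the agent side of such a query could look like. The `/ask` path, `query` parameter, and `results` key are assumptions for illustration, not the actual NLWeb specification:

```python
import json
from urllib.parse import quote

def build_ask_url(base_url: str, question: str) -> str:
    # Endpoint path and parameter name are illustrative assumptions
    return f"{base_url}/ask?query={quote(question)}"

def extract_items(response: dict) -> list[dict]:
    # Keep only entries carrying a Schema.org type annotation
    return [item for item in response.get("results", []) if "@type" in item]

# A canned response standing in for a structured answer
sample = {
    "results": [
        {"@type": "Product", "name": "Trail Runner X", "price": "89.00"},
        {"note": "untyped entry, ignored"},
    ]
}
```

The point of the pattern: the agent never renders HTML. It works directly with typed items derived from the site's existing markup.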
WebMCP goes further still. With WebMCP, a website registers tools with defined input/output schemas that AI agents discover and call as functions. A product search becomes a function call. A checkout becomes an API request. WebMCP eliminates the "page" concept entirely, dissolving the web page as a unit of content into a set of callable functions.
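The tool-registration idea can be sketched in a few lines. The registry, schema shape, and handler convention below are illustrative, not WebMCP's actual wire format:

```python
from typing import Any, Callable

# Registry of site capabilities exposed as callable tools (illustrative)
TOOLS: dict[str, dict[str, Any]] = {}

def register_tool(name: str, input_schema: dict, handler: Callable) -> None:
    TOOLS[name] = {"input_schema": input_schema, "handler": handler}

def call_tool(name: str, args: dict) -> Any:
    tool = TOOLS[name]
    # Check declared required fields before invoking the handler
    for field in tool["input_schema"].get("required", []):
        if field not in args:
            raise ValueError(f"missing required field: {field}")
    return tool["handler"](**args)

# A product search exposed as a function call rather than a page
register_tool(
    "search_products",
    {"required": ["query"]},
    lambda query, max_results=3: [
        {"name": f"{query} option {i + 1}"} for i in range(max_results)
    ],
)
```

An agent that discovers `search_products` calls it like any other function; no page load, no scraping, no rendering.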
Each mechanism works differently, but the direction is the same: the page is becoming something generated, queried, or bypassed entirely. The human-designed, human-published web page is no longer the only way content reaches an audience.
The Demand Side: AI Agents As Visitors
The demand side shifted faster. In 2024, bots surpassed human traffic for the first time in a decade, accounting for 51% of all web activity. Cloudflare's data shows AI "user action" crawling (agents actively doing things, not just indexing) grew 15x during 2025. Gartner predicts 40% of enterprise applications will feature task-specific AI agents by the end of 2026, up from less than 5% in 2025. The scale is hard to overstate.
Agentic browsers are the most visible shift. Chrome's auto browse turned 3 billion Chrome installations into potential AI agent launchpads. Google's Gemini scrolls, clicks, fills forms, and completes multi-step tasks autonomously inside Chrome. Perplexity's Comet browser conducts deep research across multiple sites simultaneously. Microsoft's Edge Copilot Mode handles multi-step workflows from within the browser sidebar. The full agentic browser landscape now includes over a dozen consumer and developer tools, all browsing on behalf of humans.
Commerce agents have moved past browsing into buying. OpenAI launched Instant Checkout to let users purchase products directly inside ChatGPT, powered by Stripe's Agentic Commerce Protocol (ACP). OpenAI killed the feature in March 2026 after near-zero purchase conversions and only a dozen merchant integrations out of over a million promised. The failure was execution, not concept: Alibaba's Qwen app processed 120 million orders in six days in February 2026 because Alibaba owns the AI model, the marketplace, the payment rails (Alipay), and the logistics. OpenAI tried to replicate agentic commerce without owning the stack. Google and Shopify's Universal Commerce Protocol (UCP) connects over 20 companies, including Walmart, Target, and Mastercard, in a framework designed for AI agents to handle commerce from product discovery through checkout. Shopify auto-opted over a million merchants into agentic shopping experiences with ChatGPT, Copilot, and Perplexity. The transaction happens in an AI conversation. No checkout page loads.
Agent-to-agent communication removes the human from both ends. Google's Agent-to-Agent (A2A) protocol lets AI agents from different vendors discover one another's capabilities and collaborate on tasks without human mediation. A travel planning agent negotiates directly with a booking agent. A procurement agent evaluates supplier agents across vendors. Over 150 organizations support A2A, including Salesforce, SAP, and PayPal, making agent-to-agent commerce and coordination a production reality.
When Both Sides Go Non-Human
Until now, one side of the web was always human. A person built the page, or a person visited it. Usually both.
Google’s patent closes the circuit.
Here's what a complete non-human flow might look like. A user tells their AI assistant they need running shoes. The assistant queries product data through NLWeb or WebMCP, no page load needed. The assistant evaluates options by checking inventory across retailers via A2A. If the user needs to review a comparison, Google generates a landing page personalized to that specific user's search history and preferences. The assistant completes checkout through ACP or UCP using Shared Payment Tokens. The user receives a confirmation.
The human's role in that entire flow: stating intent and approving the purchase. Discovery, page generation, product evaluation, and transaction completion are all handled by AI systems. The human touches only the two endpoints of the chain.
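Stubbed out as code, the chain reads like an orchestration script. Every function below is a placeholder for a protocol interaction (NLWeb/WebMCP discovery, A2A inventory checks, ACP/UCP checkout); none of these names are real APIs, and only the two `human_*` steps involve a person:

```python
def human_states_intent() -> str:
    return "running shoes"                       # human endpoint 1

def discover_products(intent: str) -> list[dict]:
    # Would query NLWeb/WebMCP data layers, not load pages
    return [{"sku": "A1", "price": 89.0}, {"sku": "B2", "price": 74.0}]

def check_inventory(products: list[dict]) -> list[dict]:
    # Would ask retailer agents over A2A; here everything is in stock
    return products

def human_approves(choice: dict) -> bool:
    return True                                  # human endpoint 2

def checkout(choice: dict) -> str:
    # Would run over ACP/UCP with a payment token
    return f"confirmed:{choice['sku']}"

def run_flow() -> str:
    intent = human_states_intent()
    options = check_inventory(discover_products(intent))
    choice = min(options, key=lambda p: p["price"])  # agent picks cheapest
    return checkout(choice) if human_approves(choice) else "cancelled"
```

Everything between the two human endpoints is machine-to-machine.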
Every piece of technology in that chain exists in production today. Chrome auto browse is live for 3 billion Chrome users. A2A has 150+ organizational supporters. ACP underpins Stripe's agentic commerce infrastructure (ChatGPT's Instant Checkout failed on execution, not protocol). UCP connects Shopify, Google, Walmart, and Target. Patent US12536233B1 is granted. No single company has assembled the full loop yet, but every component is operational.
Who's Building The Non-Human Web
Here's where it gets interesting. Map out who's building what, and a pattern emerges:
| Layer | What | Who |
|---|---|---|
| Page generation | AI landing pages | Google |
| Content-as-API | WebMCP, NLWeb | Google, Microsoft |
| Agent infrastructure | MCP, A2A | Anthropic, Google |
| Agent browsers | Chrome, Comet, Copilot | Google, Perplexity, Microsoft |
| Agent commerce | ACP, UCP | Stripe + OpenAI, Shopify + Google |
| Edge delivery | Markdown for Agents | Cloudflare |
Google appears in five of six layers: page generation (patent US12536233B1), content-as-API (WebMCP), agent infrastructure (A2A), agent browsers (Chrome auto browse), and commerce (UCP). Google is positioning itself to mediate the non-human web the same way it mediates the human one through Search.
The Agentic AI Foundation (AAIF), formed under the Linux Foundation with Anthropic, OpenAI, Google, and Microsoft as platinum members, provides the governance layer. The AAIF functions as the W3C of the agentic web: the vendor-neutral body that decides which protocols become standards for agent interoperability.
What Website Owners Need To Know
This isn't an optimization checklist. It's three structural shifts in what your website is for.
Your Data Layer Is Your Website
Google's patent generates landing pages from product feed data, making product feeds the most important asset an ecommerce business maintains. NLWeb queries Schema.org markup instead of rendering pages, making structured markup the front door to your content. WebMCP exposes site functionality as function calls, making tool definitions the user interface agents interact with.
Structured data, product feeds, JSON-LD, and API surfaces have traditionally been treated as backend infrastructure. In the non-human web, these data layers become the primary way a business reaches customers. Product feed accuracy (specs, pricing, stock levels, images) matters more than homepage design when AI systems generate the page from that feed.
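As a small example of treating the feed as the product, here is a sketch that renders one feed record as Schema.org `Product` JSON-LD. The Schema.org property names (`@type`, `offers`, `priceCurrency`, `availability`) are real vocabulary; the feed dict shape is an assumption:

```python
import json

def product_jsonld(item: dict) -> str:
    """Render one feed record as Schema.org Product JSON-LD."""
    doc = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": item["title"],
        "image": item["image_url"],
        "offers": {
            "@type": "Offer",
            "price": f"{item['price']:.2f}",
            "priceCurrency": item["currency"],
            "availability": "https://schema.org/InStock"
            if item["in_stock"]
            else "https://schema.org/OutOfStock",
        },
    }
    return json.dumps(doc, indent=2)
```

If an AI system builds the page from this data, a wrong price or stale `in_stock` flag here is a broken storefront, no matter how good the homepage looks.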
Trust Is The Moat
AI can generate a page. It can't generate a reason to seek you out by name.
Direct traffic, email subscribers, community members, and brand reputation persist when the page itself becomes replaceable. An AI agent can build a product page, but no AI agent can build the trust that makes a consumer (or their agent) request a specific brand by name.
The brands that matter in the non-human web are the ones people tell their agents to find. "Get me a fleece jacket" is a commodity query. "Get me a fleece jacket from Patagonia" is a brand moat.
The Measurement Problem
How do you measure a page you didn't build? How do you A/B test against something Google generates dynamically? How do you attribute a conversion that happened inside ChatGPT, initiated by an agent acting on behalf of a user who never saw your website?
Traditional web analytics (page views, sessions, bounce rate, time on site) assume two things: a human visitor and a page you control. On the non-human web, neither assumption holds. A Google-generated landing page isn't yours. A ChatGPT checkout session doesn't register in your analytics.
I don't have a clean answer here, and neither does anyone else. Measurement is the genuinely unsolved problem of the non-human web. New metrics will need to track agent discoverability, agent conversion rate, and data feed quality. But as of March 2026, the measurement infrastructure hasn't caught up to the technology it needs to measure.
Four Predictions For 2026-2027
Four things to watch over the next 12-18 months.
Google ships patent US12536233B1, or something like it. The technology for scoring and replacing landing pages exists. The business incentive exists. Google has a history of introducing features in ads first, then expanding (Google Shopping went from free to paid to essential). AI-generated landing pages will likely appear in shopping ads first, then expand to other verticals. Landing page quality scores in Google Ads serve as the early warning system for which pages Google considers replaceable.
Agent traffic becomes measurable. Analytics platforms will need to distinguish human sessions from agent sessions. BrightEdge reports AI agents account for roughly 33% of organic search activity as of early 2026. WP Engine's traffic data shows 1 AI bot visit for every 31 human visits by Q4 2025, up from 1 per 200 at the start of that year. Agent traffic ratios will accelerate further as Chrome auto browse rolls out globally beyond the US. New metrics around agent conversion rate and agent discoverability will emerge from necessity.
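A crude first pass at that human/agent split is user-agent matching, sketched below. The token list is illustrative and incomplete, and user-agent strings are spoofable, so real platforms will need stronger signals (IP ranges, behavioral fingerprints, verified bot registries):

```python
# Rough human-vs-agent session split by user-agent substring.
# Token list is illustrative, not exhaustive.
AGENT_TOKENS = ("gptbot", "claudebot", "perplexitybot", "ccbot")

def classify_session(user_agent: str) -> str:
    ua = (user_agent or "").lower()
    return "agent" if any(tok in ua for tok in AGENT_TOKENS) else "human"

def agent_ratio(user_agents: list[str]) -> float:
    """Fraction of sessions classified as agent traffic."""
    if not user_agents:
        return 0.0
    hits = sum(1 for ua in user_agents if classify_session(ua) == "agent")
    return hits / len(user_agents)
```

Even this naive segmentation would let a site owner start tracking the bot-to-human ratio trend on their own logs today.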
The protocol stack consolidates. MCP, A2A, NLWeb, and WebMCP form a coherent stack covering tool access, agent communication, content querying, and browser-level integration. Expect more interoperability between these protocols and fewer competing standards. The Agentic AI Foundation (AAIF) accelerates consolidation. Within 18 months, "does your site support MCP?" will be as standard a question as "is your site mobile-friendly?"
Brand differentiation gets harder and more important. When AI generates pages and agents do the shopping, the only defensible position is being the brand people (and their agents) seek out by name. Direct relationships, owned audiences, trust signals. Everything else is a commodity.
The Web Splits In Two
When Shopify auto-opted merchants into agentic shopping, I asked whether your website just became optional. The answer is more nuanced than optional or essential. It's becoming something different.
The web isn't dying. It's splitting.
The transactional web (product listings, checkout flows, information retrieval, comparison shopping) goes non-human first. AI generates the landing pages. AI agents visit and transact on those pages. Humans approve decisions at the endpoints. Google's patent lives in the transactional web, and the economics of conversion optimization push hardest toward automation in this layer.
The experiential web (brand storytelling, community, content that rewards sustained attention, design that creates emotional response) stays human. Not because AI can't generate brand experiences, but because the value of those experiences comes from the human connection behind them. Nobody tells their agent to "go enjoy a brand experience on my behalf."
Your website's new job description: data source for the agents, trust anchor for the humans, brand home for both. The companies that treat their structured data, product feeds, and API surfaces with the same care they give their homepage design are the ones that show up in both worlds.
The non-human web isn't replacing the human web. It's growing alongside it. Your job is to show up in both.
This was originally published on No Hacks.
Featured Image: Yaaaaayy/Shutterstock


