The Internet Is Getting a Second Mode

The companies that optimize for AI agents first will dominate the next era of commerce. Most businesses don't even know there's a race.

Written by Vishisht Choudhary

The web treats all non-human traffic the same: block it. That made sense when the only non-human visitors were scrapers and spam bots. It doesn't make sense when an agent is trying to buy someone a flight.

The internet was built on a simple assumption: if you're not human, you're probably hostile.

That assumption shaped everything — CAPTCHAs, rate limits, bot detection, robots.txt. For decades, it worked. Non-human traffic mostly meant scrapers, crawlers, and spam. You blocked it and moved on.

But that model is starting to break. More than half of all web traffic is now automated: Imperva's 2025 report found that non-human traffic overtook human traffic for the first time in a decade, at 51%. Most of it is still junk. But the composition is changing, and the new arrivals behave differently.

Some of that traffic is now agents: software acting on behalf of real people with real goals. At the same time, the economic incentive is becoming clear. McKinsey estimates $3 to $5 trillion in agent-mediated commerce by 2030. a16z argues that people will increasingly interface with the web through agents, starting now.

The shift has started. The infrastructure hasn’t caught up.

That creates a gap. Anthropic’s research notes that current AI usage likely “skews toward early adopters,” and its Economic Index shows higher adoption in wealthier regions. The people building and talking about agents are still a small, self-reinforcing group, which means most businesses aren’t adapting yet.

That gap between what’s happening and what people think is happening is where the opportunity sits. If you start building for agents now, you’re not competing with everyone. You’re competing with almost nobody.

Bots are not agents

The internet has always had non-human actors. Scrapers, crawlers, spam bots. We treated them all the same: block, limit, filter. If it wasn’t human, it was hostile.

Agents break that model. The difference is simple: bots execute instructions. Agents pursue goals.

A bot runs a script. Go to this URL, grab this selector, return the price. If the page changes, it fails.

An agent starts with intent. It adapts. It figures out what to do when the environment changes. A taxonomy paper in Information Fusion formalizes this distinction, but the intuition is straightforward.
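
To make that concrete, here is a minimal TypeScript sketch. The bot is a fixed script with a hard-coded selector; the agent is a goal loop that re-plans after every observation. The `llm` and `tools` interfaces are hypothetical stand-ins for an agent runtime, not any real library.

```typescript
// A bot: a fixed script. If the page layout changes, it silently breaks.
async function botGetPrice(url: string): Promise<string | null> {
  const html = await (await fetch(url)).text();
  // Hard-coded selector: any redesign of the page makes this return null.
  const match = html.match(/<span class="price">([^<]+)<\/span>/);
  return match ? match[1] : null;
}

// Hypothetical stand-ins for an agent runtime (not a real library):
declare const llm: { plan(goal: string, observation: string): Promise<string> };
declare const tools: { execute(action: string): Promise<string> };

// An agent: starts from intent and re-plans when the environment changes.
async function agentGetPrice(goal: string): Promise<string> {
  let observation = "start";
  for (let step = 0; step < 10; step++) {
    const action = await llm.plan(goal, observation); // decide the next action from the goal
    observation = await tools.execute(action);        // e.g. search, open a page, read an offer
    if (observation.startsWith("PRICE:")) return observation.slice("PRICE:".length);
  }
  throw new Error("goal not reached within budget");
}
```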

Yes, modern scrapers are getting smarter. They use LLMs, they adapt, they recover from broken selectors. But that doesn’t change what they are. A smarter scraper is still a scraper.

The real distinction isn’t technical. It’s economic. A scraper extracts data for whoever deployed it. An agent acts on behalf of a specific person toward a specific goal. That difference matters.

Visa’s Trusted Agent Protocol exists to separate credentialed agents from anonymous bots. IEEE Spectrum puts it clearly: the web was designed for humans, and agents don’t behave like humans.

Serve humans.
Facilitate agents.
Block bots.

Right now, the internet can’t do that. And until it can, the agent economy is constrained by infrastructure that treats it as a threat.
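
As a sketch of what that triage could look like, here is hypothetical Express middleware. The classifier is deliberately naive: the `Signature` header check assumes some form of cryptographically signed agent requests along the lines of emerging credentialing proposals, and `/agent-api` is an invented structured surface.

```typescript
import express from "express";

const app = express();

// Naive illustrative classifier. A real one would combine credentials,
// user-agent signals, and behavioral bot detection.
function classify(req: express.Request): "human" | "agent" | "bot" {
  if (req.headers["signature"]) return "agent"; // assumes signed agent requests
  const ua = String(req.headers["user-agent"] ?? "");
  if (/curl|python-requests|scrapy/i.test(ua)) return "bot";
  return "human";
}

app.use((req, res, next) => {
  switch (classify(req)) {
    case "human": return next();                                     // serve the normal site
    case "agent": return res.redirect(307, "/agent-api" + req.path); // structured surface
    case "bot":   return res.status(403).send("Blocked");            // legacy posture
  }
});

app.listen(3000);
```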

Why the current internet fails agents

Take a simple task: booking a flight.

You open multiple tabs, compare prices, cross-check dates, read the fine print. Seventy percent of US travelers find the process stressful. Cart abandonment sits around 70%, even higher for airlines.

This is already inefficient for humans. It should be trivial for agents. It isn’t. The constraints are structural.

Amadeus, Sabre, and Travelport control roughly 97% of global flight bookings. Access is restricted, expensive, and built for systems designed decades ago. Travel is just one example. The same pattern exists across commerce.

As McKinsey puts it: if your catalog and policies aren’t machine-readable, agents won’t find you.
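
Machine-readable can start as simply as publishing schema.org structured data alongside the human-facing page. A minimal sketch, with an invented fare standing in for a real catalog entry:

```typescript
// A schema.org Product offer, serialized as JSON-LD. The fare itself is invented.
const productJsonLd = {
  "@context": "https://schema.org",
  "@type": "Product",
  name: "Economy fare: SFO to JFK",
  offers: {
    "@type": "Offer",
    price: "189.00",
    priceCurrency: "USD",
    availability: "https://schema.org/InStock",
  },
};

// Embedded in the page, an agent can read price and availability
// without reverse-engineering the DOM:
const tag = `<script type="application/ld+json">${JSON.stringify(productJsonLd)}</script>`;
```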

The fallback is scraping. And scraping doesn’t scale.

ScrapeOps estimates that 10–15% of crawlers require weekly maintenance, and that up to 70% of total scraping cost goes to just keeping them alive.

This is backwards. We’re solving a server-side problem with fragile client-side hacks.

The entities that own the data should decide how it is exposed to agents. Not scrapers reverse-engineering DOM structures. Protocols are emerging, but adoption at the storefront layer is where everything slows down.

OIAT: the four pillars of an agent-ready internet

If the internet were rebuilt for agents, what would it need? We keep coming back to four components: Observability, Integration, Analytics, and Trust. We call it OIAT.

There’s an obvious objection: agents don’t reliably work yet.

That’s true. But it misses the point. They fail because the infrastructure they depend on doesn’t exist. The internet wasn’t designed for them. OIAT describes the infrastructure required for agents to stop failing.

Gartner estimates that 40% of enterprise applications will include task-specific agents by 2026. The applications are coming. The underlying systems are not.

The OIAT Stack

Observability: see which agents visit and what they do.
Integration: structured protocols for agent-server communication.
Analytics: new metrics for agent conversion funnels.
Trust & Identity: distinguish legitimate agents from malicious bots.

Observability

You can’t optimize what you can’t see. Current analytics measure human behavior — clicks, sessions, page views. They say nothing about agents.

Which agents are visiting? What are they parsing? Where do they fail?

Quantum Metric points out that the next visitor may not be human. Dark Visitors is an early attempt to track this shift. Without observability, everything else is guesswork.
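
A first pass at observability can be as small as tagging known agent user-agents in request logs. A sketch using Express, with an illustrative and necessarily incomplete signature list:

```typescript
import express from "express";

const app = express();

// Published AI crawler/agent user-agent substrings; illustrative, not exhaustive.
const AGENT_SIGNATURES = ["GPTBot", "ClaudeBot", "PerplexityBot", "OAI-SearchBot"];

app.use((req, _res, next) => {
  const ua = String(req.headers["user-agent"] ?? "");
  const agent = AGENT_SIGNATURES.find((sig) => ua.includes(sig));
  if (agent) {
    // In production this would feed an analytics pipeline, not stdout.
    console.log(JSON.stringify({ ts: Date.now(), agent, path: req.path }));
  }
  next();
});
```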

Integration

Protocols are emerging fast: MCP, Skills, UCP, ACP. MCP alone reached 97 million monthly SDK downloads and is now backed by major players across the ecosystem.

Everyone is building infrastructure. Very few are actually connecting it to real systems.

That gap — between protocol and implementation — is where most of the friction lives.
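
To make the gap concrete, here is a minimal sketch of connecting one real capability to MCP via its TypeScript SDK. The tool name and inventory logic are invented; the SDK calls follow its documented usage, but treat the details as illustrative.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Expose one real capability of the business as a typed tool.
const server = new McpServer({ name: "storefront", version: "0.1.0" });

server.tool(
  "check_availability",                      // hypothetical tool name
  { sku: z.string(), quantity: z.number() }, // schema the agent can introspect
  async ({ sku, quantity }) => {
    // Here you would query the actual inventory system.
    const inStock = quantity <= 10; // placeholder logic
    return { content: [{ type: "text", text: JSON.stringify({ sku, inStock }) }] };
  }
);

// Serve the tools over stdio; HTTP transports exist for hosted deployments.
await server.connect(new StdioServerTransport());
```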

Analytics

Observability tells you what happened. Analytics tells you what it means.

Agent behavior is fundamentally different from human behavior. Agents don’t browse. They evaluate and decide. That requires new metrics: discoverability, machine-readability, structured data quality, agent conversion.
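
As a sketch of one such metric, assuming visits have already been tagged as human or agent by an observability layer like the one above:

```typescript
// Hypothetical event log entries tagged by the observability layer.
type Visit = { actor: "human" | "agent"; converted: boolean };

// Agent conversion: of the agents that evaluated you, how many transacted?
function conversionRate(visits: Visit[], actor: Visit["actor"]): number {
  const cohort = visits.filter((v) => v.actor === actor);
  if (cohort.length === 0) return 0;
  return cohort.filter((v) => v.converted).length / cohort.length;
}

const visits: Visit[] = [
  { actor: "agent", converted: true },
  { actor: "agent", converted: false },
  { actor: "human", converted: true },
];
console.log(conversionRate(visits, "agent")); // 0.5
```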

The attention economy is shifting from humans to agents. The measurement layer hasn’t caught up.

Trust & Identity

Not all non-human traffic is equal. But the internet still treats it as if it were.

Strata estimates over 45 billion non-human identities. ISACA notes that existing identity systems can’t distinguish between good and bad actors.

At the same time, agents introduce new privacy challenges. Acting across multiple services means exposing user preferences across contexts.

Identity for agents isn’t just authentication. It’s trust, delegation, and privacy combined.
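
One plausible shape, sketched with the `jose` JWT library: the user's identity provider signs a delegation token naming the agent, the person, and an explicit scope. The claim names and endpoints here are invented, not any standard.

```typescript
import { jwtVerify, createRemoteJWKSet } from "jose";

// Assumed identity-provider endpoint that publishes signing keys.
const JWKS = createRemoteJWKSet(new URL("https://idp.example.com/jwks"));

async function verifyAgentDelegation(token: string) {
  const { payload } = await jwtVerify(token, JWKS, {
    issuer: "https://idp.example.com",      // who vouches for the delegation
    audience: "https://store.example.com",  // who the delegation is addressed to
  });
  // Claim names below are illustrative, not a standard:
  return {
    agent: payload["agent_id"],         // which agent is acting
    onBehalfOf: payload.sub,            // which person delegated
    scope: payload["delegation_scope"], // e.g. "purchase:flights,max:500USD"
  };
}
```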

GEO is not enough

Most companies approach this shift through GEO — Generative Engine Optimization.

The idea is simple: structure your content so it appears in AI-generated answers. It works. Research from Aggarwal et al. shows up to 40% visibility gains.

But GEO solves a narrow problem. It helps agents find you. It doesn’t help them transact with you.

Discovery is only one layer. Understanding, execution, verification, and measurement all sit on top of it.

Treating GEO as the full strategy repeats the early SEO mistake: optimizing for visibility without building the underlying system.

You need the full stack.

The timeline


Phase 1: Human-only (1990s–2010s).
The web was built for humans. Bots were noise. We blocked them.

Phase 2: Hybrid (2024–2026).
Agents appear, but the system hasn’t adapted. Traffic is already majority non-human. Companies focus on visibility (GEO), not infrastructure.

This is where we are.

Phase 3: Dual-mode (2027+).
The web supports both humans and agents natively. Protocols mature. Identity works. Analytics diverge.

Gartner estimates that 90% of B2B transactions will be agent-mediated by 2028.

The companies that build during Phase 2 are the ones that benefit in Phase 3.

The window

A new actor has entered the internet.

It has intent. It represents a real person. It is trying to spend money. Most businesses still treat it like a bot.

That won’t last.

The window is open. It won’t stay open.

Toffee