# Agent-Ready Infrastructure for This Site

We transformed our own site into agent-ready infrastructure: full schema coverage, Markdown twins for every content page, and a Cloudflare edge middleware that serves clean Markdown to AI user agents.

**Stack:** Cloudflare Pages Functions · JSON-LD · schema.org · Claude Code · Model Context Protocol (MCP)
**Time:** ~2 hours (Phase 7 of this site's build)
**Cost:** $0 (Cloudflare Pages free tier)
**Shipped:** Live at winstondigitalmarketing.com

## Context

Most agencies pitch "agent-ready" as a future service. We wanted to ship it on our own site first so the pitch comes with a live, inspectable proof point you can verify with one curl command.

## Why not llms.txt

llms.txt is a pledge, not a feature: no version, no enforcement, no observability, no guarantee a bot honors it. An edge middleware is a live, running system: measurable, tunable, and verifiable with one curl command.

## The approach

Three layers:

1. **Schema everywhere.** Organization on every page. Service on service pages. Article on Playbooks and Receipts. FAQPage on anything with an FAQ. Person on About.
2. **Parallel Markdown for every content page.** Every content page has a companion `.md` file with clean heading structure and zero HTML noise.
3. **Cloudflare Pages Function middleware.** Runs at the edge on every request. Detects AI user agents (GPTBot, ClaudeBot, PerplexityBot, CCBot, Google-Extended, anthropic-ai, Bytespider). When matched, serves the `.md` version with `Content-Type: text/markdown`.
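Layer 1 looks roughly like this on each page. A minimal sketch; the field values are illustrative, not the live site's exact markup:

```html
<!-- Organization schema, embedded in every page's <head>. -->
<!-- Values here are illustrative placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Winston Digital Marketing",
  "url": "https://www.winstondigitalmarketing.com/"
}
</script>
```

Service pages add a `Service` block, Playbooks and Receipts add `Article`, and so on per the list above.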

### The core of functions/_middleware.js

```javascript
// User-agent patterns for known AI crawlers and agents.
const AI_AGENTS = [
  /GPTBot/i, /ClaudeBot/i, /PerplexityBot/i, /CCBot/i,
  /Google-Extended/i, /anthropic-ai/i, /Bytespider/i,
];

export async function onRequest(context) {
  const { request, next } = context;
  const ua = request.headers.get("user-agent") || "";
  const isAgent = AI_AGENTS.some((rx) => rx.test(ua));

  if (isAgent && request.method === "GET") {
    // Fetch the Markdown twin of the requested page from static assets.
    const url = new URL(request.url);
    const mdPath = toMarkdownPath(url.pathname);
    const md = await context.env.ASSETS.fetch(new Request(url.origin + mdPath));
    if (md.ok) {
      return new Response(await md.text(), {
        headers: {
          "Content-Type": "text/markdown; charset=utf-8",
          "X-Served-To": "ai-agent", // observability: lets us count agent hits
          "Vary": "User-Agent", // keep caches from serving Markdown to humans
        },
      });
    }
  }
  // Everyone else (and pages with no .md twin) gets the normal HTML response.
  return next();
}
```
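The excerpt above calls `toMarkdownPath`, which isn't shown. A minimal sketch, assuming each content page `/path/` ships a sibling `/path/index.md` (the live site's mapping may differ):

```javascript
// Map an HTML pathname to its Markdown twin.
// Assumed convention: /about/ -> /about/index.md, / -> /index.md,
// /page.html -> /page.md; adjust to wherever your .md files actually live.
function toMarkdownPath(pathname) {
  if (pathname.endsWith("/")) return pathname + "index.md";
  if (pathname.endsWith(".html")) return pathname.replace(/\.html$/, ".md");
  return pathname + ".md";
}
```

For example, `toMarkdownPath("/about/")` returns `"/about/index.md"`.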

## Verify it yourself

```bash
# Human response (headers only)
curl -sI https://www.winstondigitalmarketing.com/

# Agent response: Markdown body
curl -sH "User-Agent: ClaudeBot/1.0" https://www.winstondigitalmarketing.com/ | head -20

# Agent response headers: look for "X-Served-To: ai-agent".
# Note: use a GET (-D -, not -I) because the middleware only matches GET, not HEAD.
curl -s -D - -o /dev/null -H "User-Agent: ClaudeBot/1.0" https://www.winstondigitalmarketing.com/
```

For a human-facing explainer, see `/why-markdown/`.

## What we shipped

- JSON-LD schema on all 28 pages
- Matching Markdown versions of every content page
- Cloudflare Pages Function middleware (`functions/_middleware.js`)
- `/why-markdown/` public explainer

## Lessons

1. The middleware is lower-effort and higher-value than we expected. ~2 hours of engineering. Measurable agent traffic within a day.
2. Observability matters. The `X-Served-To: ai-agent` header lets us measure bot traffic by agent family and tune the detection list.
3. "Agent-ready" and "crawler-ready" are different. Most sites are crawler-ready at best.
4. Schema is necessary but not sufficient. The middleware turns schema into leverage.

## Related

- [See the service](/services/ai-marketing/agentic-web/)
- [Read the methodology](/playbooks/the-complete-geo-audit-methodology/)
