Live on this site right now

The Four-Layer
Agentic Web

Most websites are invisible to AI agents. They can't be found, understood, or acted upon. We built a four-layer architecture that fixes this — and we built it for our own site first.

Everything you see on this page — the files, the API endpoints, the MCP server — is live and running at p0stman.com. You're looking at the proof of concept.

The problem with most websites today

Invisible to crawlers

Single-page apps render <div id="root">. There is no content for a crawler to read. LLMs see nothing. Agents see nothing.

Unstructured content

Even crawlable sites bury answers in prose. LLMs can't cite what they can't parse. No schema, no citations.

No agent interface

Humans click buttons. Agents need tools. Without an MCP server or WebMCP, agents have no programmatic way in.

The four layers

Each layer builds on the last. All four are needed. All four are live on this site.

01 Discovery

Can agents find this site?

Before an agent can do anything, it needs to find and index the site. This layer is about discoverability.

llms.txt

Standard index file for LLMs — lists every page with descriptions and categories

agents.md

Tells agents what they can do on this site, with full JSON-RPC examples

context.md

Complete company context in a single URL — shareable to any AI conversation

sitemap.xml

145 URLs indexed; submitted to Bing via IndexNow on every deploy

robots.txt

16 AI crawlers explicitly allowed: GPTBot, ClaudeBot, TavilyBot, PerplexityBot and more

Server-side rendering

Next.js App Router — every page renders real HTML. No <div id="root"> for crawlers to bounce off.
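The llms.txt format itself is deliberately simple markdown: an H1 with the site name, a blockquote summary, then H2 sections listing pages with one-line descriptions. A minimal sketch — the page names and descriptions here are illustrative, not POSTMAN's actual file:

```markdown
# POSTMAN

> AI-native product studio. Key pages for services, case studies, and agent interfaces are listed below.

## Services
- [AI Voice Agents](https://p0stman.com/services/voice-agents): Voice agents for appointment booking and reception

## Agent interfaces
- [agents.md](https://p0stman.com/agents.md): What agents can do on this site, with JSON-RPC examples
```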

02 Comprehension

Can agents understand it?

Being findable isn't enough. Agents need structured, machine-readable data — not prose they have to parse.

/api/ai/context

Full company data as structured JSON: services, case studies, pricing, contact methods

/api/ai/services

Every service with name, slug, description, price_from, currency, timeline, URL

/api/ai/portfolio

All case studies with industry, summary, tech stack, timeline, URL

JSON-LD schema

Organization, Service, Article, FAQPage, CaseStudy markup on every page. Crawlable by any agent.

Answer capsules

The direct answer is in the first 30% of every content page — the part LLMs cite 44% of the time.
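As one concrete example of the schema layer, an Organization block is a small JSON-LD object embedded in a `<script type="application/ld+json">` tag. A minimal sketch (values illustrative, not the full markup on this site):

```json
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "POSTMAN",
  "url": "https://p0stman.com"
}
```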

03 Action

Can agents take action?

The third layer is where the architecture becomes genuinely different. Agents can book calls, search the portfolio, and submit enquiries — without any human in the loop.

/api/mcp

Full MCP server — JSON-RPC 2.0 protocol. Any MCP-compatible agent can call it.

mcp.json

Discovery manifest. Agents find the tool list here.

book_discovery_call

Schedule a 30-minute discovery call. No human needs to be involved.

submit_inquiry

Submit a project enquiry directly into the CRM.

get_services / get_portfolio

Browse services and case studies as structured JSON.

navigator.modelContext

WebMCP — all 5 tools registered for browsers with the flag enabled (Chrome 146+). Coming mainstream.
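Invoking any of these tools is a single JSON-RPC call. A sketch of a tools/call request for get_services, following MCP's standard params shape of a tool name plus arguments (the category value here is illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "get_services",
    "arguments": { "category": "ai" }
  }
}
```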

04 Agent-to-Agent (A2A)

Can agents orchestrate Zero?

Layer 4 is the full handshake. Other AI agents — not just tools — can discover Zero, send tasks, and receive structured responses. Zero acts as an autonomous agent peer.

/.well-known/agent.json

AgentCard — the machine-readable manifest any A2A-compatible agent discovers first. Name, skills, endpoint, capabilities.

/api/agent

A2A JSON-RPC task endpoint. POST a task, Zero calls Gemini and responds. Real inference with company context injected.

Agent skills: inquire, portfolio, book

Three declared skills — any orchestrating agent knows what Zero can do before sending a task.

Authentication: None

Public discovery endpoint. Any A2A-compatible agent can reach Zero — no API key required.

Session logging

All A2A interactions logged to Supabase agent_sessions — agent UA, task, response, timestamp.
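For orientation, an AgentCard is a small JSON manifest. The field names below follow the A2A specification's AgentCard shape; the values are illustrative rather than Zero's actual manifest:

```json
{
  "name": "Zero",
  "description": "POSTMAN's autonomous agent",
  "url": "https://p0stman.com/api/agent",
  "capabilities": { "streaming": false },
  "skills": [
    { "id": "inquire", "name": "Submit a project enquiry" },
    { "id": "portfolio", "name": "Browse case studies" },
    { "id": "book", "name": "Book a discovery call" }
  ]
}
```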


Live MCP demo

This is calling the real /api/mcp endpoint right now, from your browser. Any AI agent with MCP support can do exactly this — no human needed.

book_discovery_call
name, email, project_description, preferred_time?

Schedules a free 30-minute discovery call with Paul. Request is stored in the CRM immediately. No human action required to receive it.

submit_inquiry
name, email, message, project_type?

Submits a detailed project enquiry. Equivalent to filling in the contact form — stored in Supabase, Paul is notified.

get_services
category?

Returns all POSTMAN services with name, description, price_from, currency, timeline, and URL. Optionally filter by category.

get_portfolio
industry?

Returns all case studies with title, industry, summary, tech stack, and URL. Optionally filter by industry (e.g. hospitality, fintech).

search_content
query

Searches across services and case studies by keyword. Returns matched service names and case study titles with URLs.

tools/list
(none)

Returns the full list of available tools with their input schemas. Standard MCP discovery — any agent should call this first.

POST https://p0stman.com/api/mcp
REQUEST BODY
{
  "jsonrpc": "2.0",
  "method": "tools/list",
  "id": 1,
  "params": {}
}

Live A2A handshake demo

This fires a real A2A tasks/send request to /api/agent. Zero (Gemini 2.0 Flash) processes it and responds. Not a mock — real inference, right now.

POST https://p0stman.com/api/agent
A2A REQUEST (JSON-RPC 2.0)
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tasks/send",
  "params": {
    "id": "demo-task",
    "message": {
      "role": "user",
      "parts": [
        {
          "text": "What AI services do you offer for fintech startups?"
        }
      ]
    }
  }
}

It works. Here's proof.

ChatGPT cited us, unprompted

A clinic manager in Dubai asked ChatGPT about voice agents for appointment booking. ChatGPT cited p0stman.com/locations/dubai/ — specifically the answer capsule at the top of that page, which directly answered her query.

She clicked through, read the page, and submitted a project enquiry the same session. The enquiry converted to a discovery call. Zero ad spend. Zero cold outreach. No SEO campaign. The architecture did the work.

The answer capsule on that page said: "POSTMAN builds AI voice agents for Dubai clinics and medical centres — handling appointment booking, patient follow-ups, and multilingual reception in Arabic and English." That sentence matched her query closely enough for ChatGPT to surface it as the answer.

LLM traffic converts differently

LLM-referred visitors convert 4.4× better than organic search (Adobe, 2025). Bounce rate is 45% lower — they arrive with high intent, having already had the question answered by an AI before clicking.

We track LLM source in our own Supabase analytics — UTM source, referrer domain, and traffic category. ChatGPT automatically appends ?utm_source=chatgpt.com to links it cites, so attribution is captured even when the referrer header is stripped.
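The attribution check itself is small. A minimal sketch of the idea — the source list and category names are assumptions for illustration, not our actual analytics code:

```python
from urllib.parse import urlparse, parse_qs

# Hosts treated as LLM referrers (illustrative list).
LLM_SOURCES = {"chatgpt.com", "perplexity.ai", "claude.ai", "gemini.google.com"}

def classify_visit(landing_url: str, referrer: str = "") -> str:
    """Categorise a visit as llm, referral, or direct."""
    qs = parse_qs(urlparse(landing_url).query)
    utm = qs.get("utm_source", [""])[0]          # e.g. ?utm_source=chatgpt.com
    ref_host = urlparse(referrer).hostname or ""  # empty if referrer stripped
    if utm in LLM_SOURCES or ref_host in LLM_SOURCES:
        return "llm"
    if ref_host:
        return "referral"
    return "direct"
```

Because the UTM parameter survives even when the referrer header is stripped, the first branch catches ChatGPT traffic that the referrer check alone would miss.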

Most sites don't track this at all. Most sites don't even render content that LLMs can read. That's the gap this architecture closes.

What an agent can do with POSTMAN right now

No human in the loop. No form-filling. Pure agent-to-server communication.

Book a discovery call

An agent representing a client can call book_discovery_call with name, email, and project description. The request lands directly in our CRM.

Submit a project enquiry

submit_inquiry with a detailed brief. Same result as filling in the contact form — stored in Supabase, Paul gets notified.

Research our services

get_services, get_portfolio, search_content — any agent doing research on AI studios can pull structured data without scraping HTML.

We build this for clients

The architecture is proven. We built it on our own site. Every component — the MCP server, the AI endpoints, the schema markup, the llms.txt, the IndexNow pipeline — is a repeatable pattern we can apply to any existing website or new build.

Agentic Web Audit
From £1,500 (~$1,900)
1 week
  • Full audit of current agent-visibility
  • Gap analysis across all four layers
  • Prioritised implementation plan
  • llms.txt, agents.md, robots.txt setup
Full Four-Layer Build
From £5,000 (~$6,400)
1–2 weeks
  • All four layers implemented
  • MCP server with your business tools
  • AI-readable JSON endpoints
  • IndexNow pipeline for ongoing indexing
  • WebMCP browser registration
  • A2A agent endpoint with Gemini inference
Talk to us about your site
The discovery layer

These are the forces that decide who gets found.

AI assistants, agent search engines, and reasoning models are replacing the query box. If your content isn't structured for them — if it's not in their training data, not indexable by their crawlers, not callable by their agents — you're invisible to an audience that is already making purchasing decisions without a single click on Google.

ChatGPT
Claude
Gemini
Grok
Perplexity
Meta AI

This is not the future. It is already happening. ChatGPT cites sources in every response. Perplexity crawls and indexes in real time. Claude reads your llms.txt before every session. Grok mines X for brand signals. Gemini surfaces structured schema directly in Google results. The question is not whether these systems will find your business — it is whether they will find it accurately, trust it, and act on it. That is exactly what the four-layer architecture solves.

Frequently asked questions

What does an Agentic Web build include?
A full four-layer build includes: Layer 1 (Discovery) — llms.txt, agents.md, context.md, sitemap setup, robots.txt configuration, and IndexNow integration for real-time search engine submission; Layer 2 (Comprehension) — JSON-LD schema on every page, structured AI-readable REST endpoints (/api/ai/context, /api/ai/services, /api/ai/portfolio), and answer capsule implementation across content pages; Layer 3 (Action) — a full MCP server with JSON-RPC 2.0 protocol, custom tools for your business actions (bookings, enquiries, search), and WebMCP browser registration for Chrome; Layer 4 (Agent-to-Agent / A2A) — an AgentCard at /.well-known/agent.json, an A2A JSON-RPC task endpoint at /api/agent, and autonomous responses from Zero using Gemini inference with your business context injected. Delivered in 1–2 weeks from £5,000 (~$6,400).
Does my site need to be rebuilt in Next.js?
No. The Discovery and Comprehension layers can be added to any existing site regardless of framework. llms.txt, agents.md, and JSON-LD schema work on WordPress, Webflow, Squarespace, and any static HTML site. The MCP server (Layer 3) requires a server-side runtime — this can be deployed as a standalone API alongside your existing site without touching the frontend. The only scenario where a rebuild makes sense is if your site is a single-page app (React, Vue, Angular) that renders client-side only — those are genuinely invisible to AI crawlers.
How is this different from regular SEO?
Traditional SEO optimises for keyword-matching algorithms in search engines. The Agentic Web is optimised for language models that read, reason about, and cite content — and for AI agents that take actions on behalf of users. The signals are different: LLMs prioritise direct answers in the first 30% of content, structured schema, and machine-readable endpoints. An LLM doesn't care about keyword density or backlink count. It cares whether the page directly answers the question being asked.
How do I know if my site is already being cited by AI?
The clearest signal is UTM attribution: ChatGPT automatically appends ?utm_source=chatgpt.com to links it cites. If you have any analytics at all, you can check for this parameter in your traffic. Perplexity sends a referrer header from perplexity.ai. Claude and other models strip referrers but are working on attribution standards. A simpler test: search for your core topic in ChatGPT or Perplexity and see if your site appears in citations.
Can any AI agent call the POSTMAN MCP server right now?
Yes. The MCP server at https://p0stman.com/api/mcp is live and publicly accessible. Any MCP-compatible client can POST a JSON-RPC 2.0 request to discover tools (method: tools/list) and call them (method: tools/call). No API key required. The available tools are: book_discovery_call, submit_inquiry, get_services, get_portfolio, and search_content. The discovery manifest is at https://p0stman.com/mcp.json.
What is WebMCP and when does it matter?
WebMCP is a W3C-proposed browser standard (navigator.modelContext) that lets websites register tools directly with AI models running in the browser. When a user has an AI assistant in their browser (Chrome 146+ with the flag enabled, and eventually mainstream), that assistant can discover and call the registered tools without any external server. POSTMAN registers all 5 MCP tools via WebMCP on every page load. The flag is currently opt-in but expected to become default — similar to how Web Bluetooth or Geolocation went from experimental to standard.
// This page is itself part of the four-layer architecture.
// It is server-rendered (Layer 1), carries JSON-LD schema (Layer 2),
// is accessible via the MCP server at /api/mcp (Layer 3),
// and orchestratable by other agents via A2A at /api/agent (Layer 4).
AGENT INTERFACE ACTIVE · MCP: p0stman.com/api/mcp · 5 TOOLS REGISTERED · [DISCOVERY] llms.txt · agents.md · context.md · sitemap.xml · robots.txt · TavilyBot ALLOWED · ClaudeBot ALLOWED · GPTBot ALLOWED · PerplexityBot ALLOWED · [COMPREHENSION] JSON-LD schema · /api/ai/context · /api/ai/services · /api/ai/portfolio · [ACTION] book_discovery_call · submit_inquiry · get_services · get_portfolio · search_content · [A2A] AgentCard: /.well-known/agent.json · Task endpoint: /api/agent · A2A JSON-RPC 2.0 · navigator.modelContext REGISTERED · WebMCP: 5 TOOLS · INDEXNOW: 145 URLs · Bing NOTIFIED