How I Built an Agent-Ready Website with MCP, Agent Discovery and Private AI Workflows

A technical look at how this site exposes public discovery while keeping write actions, publishing and costly AI workflows behind private keys.

Uygar Duzgun
Apr 30, 2026
8 min read

Most AI demos fail at the same point: the model can talk, but it cannot safely do useful work inside the systems where the business actually runs. A production-ready agent setup needs more than a prompt. It needs discovery, tools, authentication, scoped access, logging and a workflow that keeps humans in control of important actions.

This website is built as a working example. Public agents can discover what exists here. Private clients can request a scoped pilot key. Owner-only tools can create, rewrite, translate, enrich and publish content. Those surfaces are intentionally separated.

Recommended reading: AI Agent Systems for Growing Businesses

The business problem

Founders and operators do not need another chatbot tab. They need agent systems that can connect to content, SEO data, product data, APIs and internal workflows without turning every action into a security risk.

The practical goal is simple:

Let public agents understand the service.
Let approved companies test one scoped workflow.
Keep all write actions behind private authentication.
Preserve human review before publishing or changing anything important.
Make the technical surface clear enough for a CTO to audit.

The architecture pattern

The site uses a layered model instead of one open automation endpoint.

| Layer | Public? | Purpose |
| --- | --- | --- |
| Agent card | Yes | Describes the agent, capabilities and interfaces |
| API catalog | Yes | Lists machine-readable API metadata and docs |
| Public MCP | Yes | Read-only site and blog discovery |
| Customer API | Key required | Runs a scoped article or SEO pilot workflow |
| Owner MCP | Key required | Manages content, research, publishing and SEO tools |
| Admin API | Admin only | Direct blog, settings and pipeline operations |

The important part is not the acronym. The important part is the boundary. Discovery can be open. Business-changing actions should not be.

Public discovery

Public discovery lets humans and AI systems understand the site without credentials. This includes an agent card, an API catalog, an OpenAPI document, llms.txt, markdown rendering and a public read-only MCP endpoint.
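
As a sketch, the agent card can be a small typed document. The field names below are illustrative assumptions, not the exact schema this site serves:

```typescript
// Hypothetical agent card shape; field names are assumptions for illustration.
interface AgentCard {
  name: string;
  description: string;
  capabilities: string[];                     // read-only discovery tools
  interfaces: { type: string; url: string }[]; // where machines can connect
  contact: string;                            // where to request a pilot key
}

export const agentCard: AgentCard = {
  name: "example-site-agent",
  description: "Read-only discovery for site and blog content",
  capabilities: ["site.overview", "blog.list", "blog.read"],
  interfaces: [
    { type: "mcp", url: "https://example.com/mcp" },
    { type: "openapi", url: "https://example.com/openapi.json" },
  ],
  contact: "https://example.com/contact",
};
```

Serving this as static JSON is enough: an external agent can read it without credentials and learn both what exists and how to ask for more access.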

That public layer answers questions like:

What does this site offer?
Which endpoints exist?
Which tools are read-only?
Where is the documentation?
How should an external agent request access?

It does not create posts. It does not run expensive AI jobs. It does not publish content.

Private pilot workflows

For a business, the first useful test should be narrow. The recommended pilot is an AI content and SEO workflow because it has a clear input, clear output and measurable business value.

A private pilot key can be scoped to a workflow like this:

Submit a topic, product, service, YouTube URL or content brief.
Run research and Search Console-informed recommendations.
Generate a draft article or SEO brief.
Return metadata, internal links, FAQ suggestions and image direction.
Save as draft only when the scope allows it.
Review the output before anything is published.

This gives founders a real test without handing the system broad access.
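
The pilot scope above can be expressed directly in types. This is a minimal sketch under assumed names, not this site's actual API; the point is that "draft only" is enforced by the scope object, not by convention:

```typescript
// Hypothetical pilot workflow types; all names are assumptions.
type PilotInput =
  | { kind: "topic"; value: string }
  | { kind: "youtube"; value: string }
  | { kind: "brief"; value: string };

interface PilotScope {
  workflow: "content-seo-pilot";
  canSaveDraft: boolean;   // publishing is never in scope for a pilot key
  dailyRunLimit: number;
}

interface PilotResult {
  draft: string;
  metadata: { title: string; slug: string };
  internalLinks: string[];
  faqSuggestions: string[];
  savedAsDraft: boolean;
}

// The result is saved as a draft only when the scope allows it,
// and there is no publish path at all on this surface.
function runPilot(input: PilotInput, scope: PilotScope): PilotResult {
  const title = `Draft: ${input.value}`;
  return {
    draft: `# ${title}\n\n(generated content)`,
    metadata: { title, slug: input.value.toLowerCase().replace(/\s+/g, "-") },
    internalLinks: [],
    faqSuggestions: [],
    savedAsDraft: scope.canSaveDraft,
  };
}
```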

The MCP split

There are two MCP surfaces:

Public MCP: read-only discovery for site overview and published blog content.
Owner MCP: authenticated tools for content management, research, SEO, images, translation and publishing.

That split keeps the public surface useful while preventing anonymous write access. A public agent can inspect what the service does. It cannot modify the business.
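
The split can be modeled as a single tool registry filtered by surface. This is an assumption about structure for illustration, not the MCP SDK's actual API:

```typescript
// Sketch of the public/owner split; tool names and shapes are assumptions.
type Surface = "public" | "owner";

interface Tool {
  name: string;
  surface: Surface;
  readOnly: boolean;
  run: (args: Record<string, unknown>) => unknown;
}

const tools: Tool[] = [
  { name: "blog.list", surface: "public", readOnly: true, run: () => [] },
  { name: "blog.publish", surface: "owner", readOnly: false, run: () => "published" },
];

// Anonymous callers see only read-only public tools;
// an owner key unlocks the full set.
function visibleTools(hasOwnerKey: boolean): string[] {
  return tools
    .filter((t) => (hasOwnerKey ? true : t.surface === "public" && t.readOnly))
    .map((t) => t.name);
}
```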

The content pipeline

The internal pipeline is multi-step rather than one giant prompt. It can include Search Console intelligence, web research, topic validation, SEO strategy, writing, editor review, polishing, humanizing, image generation, YouTube discovery and publisher logic.

This structure makes the system easier to debug because each stage has a job:

Research validates the topic and search intent.
SEO plans the title, slug, internal links and content depth.
Writer creates the draft.
Editor checks quality and can request revisions.
Polish and humanizer agents clean the final text.
Publisher saves the draft with metadata.

For clients, this means the output is not just "AI text." It is a controlled workflow with review points.
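
The staged pipeline above can be sketched as a list of functions over a shared draft state. The stage logic here is placeholder, but the structure shows why debugging is easier: each stage does one job and leaves a log entry:

```typescript
// Illustrative pipeline sketch; stage internals are stand-ins, not real logic.
interface DraftState {
  topic: string;
  title?: string;
  body?: string;
  approved: boolean;
  log: string[];
}

type Stage = (s: DraftState) => DraftState;

const research: Stage = (s) => ({ ...s, log: [...s.log, "research"] });
const seo: Stage = (s) => ({ ...s, title: `Guide: ${s.topic}`, log: [...s.log, "seo"] });
const writer: Stage = (s) => ({ ...s, body: `About ${s.topic}`, log: [...s.log, "writer"] });
const editor: Stage = (s) => ({ ...s, approved: s.body !== undefined, log: [...s.log, "editor"] });
// Publisher saves a draft only when the editor approved it.
const publisher: Stage = (s) =>
  s.approved ? { ...s, log: [...s.log, "draft-saved"] } : s;

function runPipeline(topic: string): DraftState {
  const stages = [research, seo, writer, editor, publisher];
  return stages.reduce(
    (s, stage) => stage(s),
    { topic, approved: false, log: [] } as DraftState
  );
}
```

When a stage misbehaves, the log shows exactly where the draft stalled, and the editor gate keeps unapproved text from reaching the publisher.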

Security rules that matter

The safest agent systems are boring in the right places. The implementation should avoid public write access, separate customer keys from owner keys, keep secrets out of source code, rate-limit customer workflows and log important actions.

A practical security checklist:

Use separate keys for customer pilots, owner MCP and admin APIs.
Give customer keys the narrowest useful scope.
Default content generation to draft mode.
Never expose service role keys in client code.
Keep destructive actions behind owner/admin authentication.
Make public tools read-only.
Document which surfaces are public and which require credentials.
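
The first two checklist items reduce to a small scope check before every action. Key and scope names below are assumptions for illustration:

```typescript
// Minimal sketch of per-key scopes; key and action names are assumptions.
const keyScopes: Record<string, Set<string>> = {
  "pilot-key": new Set(["content.generate", "content.save_draft"]),
  "owner-key": new Set(["content.generate", "content.save_draft", "content.publish"]),
};

// Deny by default: unknown keys and out-of-scope actions both fail.
function authorize(key: string, action: string): boolean {
  const scopes = keyScopes[key];
  return scopes !== undefined && scopes.has(action);
}
```

The useful property is the default: a pilot key that never lists `content.publish` cannot publish, no matter what the model asks for.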

Why this matters for e-commerce and growth teams

The same pattern works beyond articles. E-commerce teams can use it for product descriptions, category SEO, campaign drafts, reporting, product research and internal knowledge workflows. Technical teams can connect it to existing APIs instead of replacing the whole stack.

The value comes from starting small. One private workflow proves whether the agent can produce useful work, follow constraints and integrate with the real business.

Start with one workflow

The best first step is not a full autonomous company. It is one scoped workflow with a clear input, clear output and clear review process.

For most businesses, that means a private pilot key for AI content and SEO automation. Once that works, the same architecture can expand into e-commerce operations, reporting and custom internal workflows.


FAQ

Should agent discovery be public?
Discovery can be public when it only describes capabilities and read-only resources. Tool calls that create, update, delete, publish or run expensive AI jobs should require authentication.
What should a private pilot key allow?
A pilot key should be scoped to one workflow, such as generating an SEO brief or draft article. It should not grant broad owner-level access.
Is MCP only useful for developers?
No. MCP is technical infrastructure, but the business value is safer automation: agents can discover tools, call approved workflows and return structured results.

Recommended for you

Custom CRM CMS with Next.js and AI Agents in 2026

How I built a custom CRM CMS with Next.js, Supabase, and AI agents to run 500+ posts, SEO workflows, and multilingual publishing.

18 min read
Search Console FAQ Schema: Build a PAA Feature Fast

Turn real Google Search Console queries into a People Also Ask-style FAQ system with AI, review layers, and valid FAQPage schema.

24 min read
Automate 404 Redirects on Vercel with AI Agents

I used Vercel logs, Claude Code, and bulk 301 rules to automate 404 redirects after a WordPress-to-Next.js migration and protect rankings.

16 min read