AI & MCP Integration
Coolpie.ai documentation is designed to be natively accessible to AI assistants and LLM-powered developer tools — no scraping, no workarounds.
MCP Server
The Coolpie documentation MCP (Model Context Protocol) server lets AI assistants — primarily Claude — query the docs directly as part of a conversation. Instead of copy-pasting documentation into a chat, developers working on Coolpie integrations can have the AI look up the relevant sections automatically.
Endpoint
| Property | Value |
|---|---|
| URL | https://mcp.docs.coolpie.com |
| Protocol | MCP over HTTP (Streamable HTTP transport) |
| Authentication | None — public endpoint |
| Content | Full text of all documentation pages, updated on every docs deploy |
| Infrastructure | Cloudflare Workers — edge-deployed, low latency globally |
How it works
The server bundles a pre-built index of all documentation pages (extracted from HTML at deploy time). When a tool is called, it searches or retrieves from this in-memory index — no database, no external calls, sub-100ms response times.
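To make the idea concrete, here is a minimal sketch of that kind of in-memory keyword index. The page data, tokenization, and scoring below are invented for illustration only; they are not the server's actual implementation:

```python
# Illustrative in-memory docs index with naive keyword scoring.
# Page entries and the scoring scheme are assumptions for this sketch.
INDEX = [
    {"slug": "data-outputs", "title": "Data Outputs",
     "text": "Coolpie.ai exposes competitive pricing data via a JSON export file and Google BigQuery."},
    {"slug": "matching", "title": "Matching",
     "text": "How products are matched against competitor listings."},
]

def search(query: str, limit: int = 5) -> list[dict]:
    """Score each page by how often the query terms appear in its title and text."""
    terms = query.lower().split()
    results = []
    for page in INDEX:
        haystack = (page["title"] + " " + page["text"]).lower()
        score = sum(haystack.count(term) for term in terms)
        if score > 0:
            results.append({"slug": page["slug"], "score": score})
    # Rank best match first, as the real server does for search results.
    return sorted(results, key=lambda r: r["score"], reverse=True)[:limit]
```

Because everything lives in one process-local structure, a lookup is just a loop over a small list — which is why no database or external call is needed.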
Available Tools
The MCP server exposes three tools. All tools are called via POST https://mcp.docs.coolpie.com/call with a JSON body {"tool": "...", "input": {...}}.
list_coolpie_pages
Returns a list of all available documentation pages with their slug, title, description, URL, and section (user-guide or technical).
get_coolpie_page
Returns the full text content of a specific documentation page.
- slug (string, required) — page identifier, e.g. data-outputs, matching, price-suggestions
search_coolpie_docs
Full-text search across all documentation pages. Returns ranked results with a snippet showing the matching context and the URL of the relevant page.
- query (string, required) — natural language or keyword search query
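A small Python client for these tools might look like the following. The endpoint URL and the `{"tool": ..., "input": {...}}` body shape come from the docs above; the helper names are my own:

```python
import json
import urllib.request

MCP_ENDPOINT = "https://mcp.docs.coolpie.com/call"

def build_call(tool: str, **input_args) -> dict:
    """Build the documented request body: {"tool": ..., "input": {...}}."""
    return {"tool": tool, "input": input_args}

def call_tool(tool: str, **input_args) -> dict:
    """POST a tool call to the MCP endpoint and return the parsed JSON response."""
    body = json.dumps(build_call(tool, **input_args)).encode()
    req = urllib.request.Request(
        MCP_ENDPOINT,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Shape of each tool call (these perform live network requests):
# call_tool("list_coolpie_pages")
# call_tool("get_coolpie_page", slug="matching")
# call_tool("search_coolpie_docs", query="json export format")
```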
Example: search call

Request:

```
POST https://mcp.docs.coolpie.com/call
Content-Type: application/json

{
  "tool": "search_coolpie_docs",
  "input": { "query": "json export format" }
}
```

Response:

```json
{
  "result": [
    {
      "slug": "data-outputs",
      "title": "Data Outputs",
      "url": "https://docs.coolpie.com/data-outputs.html",
      "snippet": "...Coolpie.ai exposes its competitive pricing data via a JSON export file and Google BigQuery. Each website has a dedicated JSON file regenerated on every price monitoring cycle...",
      "score": 26
    }
  ]
}
```
Example: get_coolpie_page call

Request:

```
POST https://mcp.docs.coolpie.com/call
Content-Type: application/json

{
  "tool": "get_coolpie_page",
  "input": { "slug": "matching" }
}
```

Response:

```json
{
  "result": {
    "slug": "matching",
    "title": "Matching",
    "url": "https://docs.coolpie.com/matching.html",
    "section": "user-guide",
    "content": "... full page text ..."
  }
}
```
Setup in Claude
Add the MCP server once in your Claude settings and it will be available in every conversation automatically.
Claude.ai (web)
- Go to Settings → Integrations → Add MCP Server
- Enter the URL: https://mcp.docs.coolpie.com
- Name it Coolpie Docs and save
Claude Desktop / Claude Code
Add to your claude_desktop_config.json:
```json
{
  "mcpServers": {
    "coolpie-docs": {
      "url": "https://mcp.docs.coolpie.com"
    }
  }
}
```
Once connected, Claude can answer questions like "How does the JSON export work?" or "What fields are in a product entry?" by querying the docs directly — without you needing to paste anything into the chat.
llms.txt
For AI systems that use web crawling rather than MCP (ChatGPT, Gemini, Perplexity, etc.), the documentation index is also published as a machine-readable llms.txt file at the root of the documentation site.
This follows the llms.txt standard — a plain-text index that tells LLM crawlers what pages exist, what they contain, and how they relate to each other. It is updated automatically on every documentation deploy.
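Under the llms.txt convention, pages are listed as markdown link lines of the form `- [Title](url): description`. A consumer could extract those entries with a few lines of Python; note this assumes Coolpie's file follows that standard layout:

```python
import re

# One llms.txt link-list line: "- [Title](url): optional description"
LINK_LINE = re.compile(
    r"^- \[(?P<title>[^\]]+)\]\((?P<url>[^)]+)\)(?::\s*(?P<desc>.*))?$"
)

def parse_llms_txt(text: str) -> list[tuple[str, str, str]]:
    """Extract (title, url, description) entries from an llms.txt index."""
    entries = []
    for line in text.splitlines():
        m = LINK_LINE.match(line.strip())
        if m:
            entries.append((m["title"], m["url"], m["desc"] or ""))
    return entries
```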