How MCP Works
The Model Context Protocol contract that lets Claude, ChatGPT, Gemini, and any MCP client invoke Parlay's typed prediction-market tools.
The Model Context Protocol (MCP) is the contract that lets any AI client — Claude, ChatGPT, Gemini, OpenClaw — talk to Parlay's prediction-market tools without an SDK. Hit the Parlay endpoint and the server returns a descriptor that tells the client what tools, resources, and prompts are available, then the client invokes them with typed arguments and gets typed JSON back.
Why this matters
Connect once and every MCP-aware client can research Polymarket, Kalshi, and Manifold the same way. No SDK to install, no per-venue glue code, no scraping — every call hits live data.
Why MCP for prediction markets
Connect once, work everywhere
One MCP endpoint serves Claude, ChatGPT, Gemini CLI, OpenClaw, and any future client that speaks the protocol.
Typed tools, not screen-scraping
Every market query has a JSON schema. Models pass real arguments, get real fields back — no hallucinated columns.
Live data, not training memories
Each tool call hits the venue in real time. Odds, depth, and trades are current, not last-quarter snapshots.
The MCP descriptor
When a client connects, the server returns a descriptor describing itself and what it exposes. Parlay's looks like this:
{
  "server": {
    "name": "Parlay",
    "version": "1.0.0",
    "transport": "http"
  },
  "capabilities": {
    "tools": {
      "search_markets": {
        "description": "Search active markets across Polymarket, Kalshi, and Manifold by keyword, category, or close date.",
        "inputSchema": { "type": "object", "properties": { "query": { "type": "string" }, "venue": { "type": "string" } }, "required": ["query"] }
      },
      "get_quote": {
        "description": "Return the current yes/no implied probability for a specific market across one or more venues.",
        "inputSchema": { "type": "object", "properties": { "market_id": { "type": "string" } }, "required": ["market_id"] }
      },
      "get_orderbook": {
        "description": "Return the live order-book depth for a market, useful for spotting liquidity and slippage.",
        "inputSchema": { "type": "object", "properties": { "market_id": { "type": "string" }, "depth": { "type": "integer" } }, "required": ["market_id"] }
      },
      "get_trades": {
        "description": "Return recent fills on a market for momentum and volume analysis.",
        "inputSchema": { "type": "object", "properties": { "market_id": { "type": "string" }, "limit": { "type": "integer" } }, "required": ["market_id"] }
      }
    },
    "resources": [
      {
        "uri": "parlay://venues",
        "name": "venues",
        "description": "Catalog of supported venues with status, regulatory jurisdiction, and supported tools.",
        "mimeType": "application/json"
      }
    ],
    "prompts": []
  }
}

Three primitives carry every MCP server:
Tools
Functions the model can invoke with typed arguments. Parlay exposes search, quotes, order-book depth, and trade history today; trading lands next.
Resources
Static or near-static reference data the client can fetch by URI. Parlay ships a venues catalog at parlay://venues.
Prompts
Optional reusable prompt templates servers can advertise. Parlay's list is empty today — clients drive the interaction.
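To make the descriptor concrete, here is a minimal sketch of how a client might walk it to discover what it can call. The descriptor dict below is trimmed from the example above, and `list_tools` is a hypothetical helper, not part of any Parlay SDK:

```python
# Trimmed copy of the descriptor's "capabilities" section from the
# example above, embedded directly so the sketch is self-contained.
descriptor = {
    "capabilities": {
        "tools": {
            "search_markets": {
                "inputSchema": {
                    "type": "object",
                    "properties": {"query": {"type": "string"}, "venue": {"type": "string"}},
                    "required": ["query"],
                }
            },
            "get_quote": {
                "inputSchema": {
                    "type": "object",
                    "properties": {"market_id": {"type": "string"}},
                    "required": ["market_id"],
                }
            },
        },
        "resources": [{"uri": "parlay://venues"}],
    }
}

def list_tools(desc):
    """Map each tool name to its required argument names."""
    tools = desc["capabilities"]["tools"]
    return {name: spec["inputSchema"].get("required", []) for name, spec in tools.items()}

print(list_tools(descriptor))
# {'search_markets': ['query'], 'get_quote': ['market_id']}
```

This is the whole point of the descriptor: a client needs no out-of-band documentation to know which tools exist and what arguments they demand.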
How a request flows
Connect to the endpoint
The client opens the MCP transport (HTTP for hosted Parlay, stdio for local CLI agents) and authenticates with the API key from your dashboard.
Server announces capabilities
Parlay returns the descriptor above. The client now knows the tool names, argument shapes, and response types — without any extra documentation.
Client invokes a tool
The model picks a tool, fills in arguments that match the schema, and sends the call. The protocol validates types before the request reaches Parlay's venue clients.
Server returns typed JSON
Parlay routes the call to the right venue (Polymarket, Kalshi, or Manifold), normalizes the response, and returns typed JSON the model can reason about directly.
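The type check in step 3 can be sketched as follows. This is a hand-rolled subset of JSON Schema validation for illustration only, not the protocol's actual validator:

```python
# Map JSON Schema scalar types to Python types for a minimal check.
PY_TYPES = {"string": str, "integer": int, "object": dict}

def validate_args(schema, args):
    """Return a list of problems; an empty list means the call may proceed."""
    problems = []
    for name in schema.get("required", []):
        if name not in args:
            problems.append(f"missing required argument: {name}")
    for name, value in args.items():
        prop = schema.get("properties", {}).get(name)
        if prop is None:
            problems.append(f"unknown argument: {name}")
        elif not isinstance(value, PY_TYPES[prop["type"]]):
            problems.append(f"{name}: expected {prop['type']}")
    return problems

# The get_orderbook inputSchema from the descriptor above.
schema = {
    "type": "object",
    "properties": {"market_id": {"type": "string"}, "depth": {"type": "integer"}},
    "required": ["market_id"],
}

print(validate_args(schema, {"market_id": "kalshi:FED-25JUN-CUT", "depth": 5}))  # []
print(validate_args(schema, {"depth": "5"}))
# ['missing required argument: market_id', 'depth: expected integer']
```

Catching a malformed call on the client side means the model gets a precise correction signal instead of a venue error.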
Transports
Parlay speaks two transports. Hosted clients use HTTP; local CLI agents use stdio.
Most clients — Claude, ChatGPT, Gemini, Cursor — connect over HTTP with an API key.
https://parlay.run/mcp
Authorization: Bearer <your-api-key>

Paste the URL into the client's MCP connector settings, sign in, and the descriptor handshake happens automatically.
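On the wire, the HTTP transport carries JSON-RPC 2.0 messages; a tool call uses the `tools/call` method with a `name`/`arguments` params object, per the MCP specification. A sketch that builds such a request without sending it (the `build_tool_call` helper and the `sk-demo` key are illustrative, not a real SDK or credential):

```python
import json

def build_tool_call(api_key, tool, arguments, request_id=1):
    """Assemble the URL, headers, and JSON-RPC body for one MCP tool call."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    payload = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }
    return "https://parlay.run/mcp", headers, json.dumps(payload)

url, headers, body = build_tool_call(
    "sk-demo",  # placeholder key, not a real credential
    "get_quote",
    {"market_id": "polymarket:will-fed-cut-rates-in-june-2026"},
)
print(url)
print(body)
```

Any HTTP client that can POST this body with these headers can drive Parlay; that is what "no SDK" means in practice.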
CLI agents like OpenClaw can spawn Parlay as a local subprocess and pipe MCP messages over stdio:
npx -y @parlay/mcp@latest

The CLI reads PARLAY_API_KEY from your environment. Use this when you want zero network hops between agent and server.
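The stdio transport is just newline-delimited JSON messages piped over a child process's stdin and stdout. The sketch below spawns a stub child that echoes a canned JSON-RPC response, standing in for the real Parlay process (which would need a network connection and a PARLAY_API_KEY); the framing is what matters here:

```python
import json
import subprocess
import sys

# Stub server: read one JSON-RPC request line, answer it, exit.
CHILD = (
    "import json,sys\n"
    "req = json.loads(sys.stdin.readline())\n"
    "resp = {'jsonrpc': '2.0', 'id': req['id'], 'result': {'ok': True}}\n"
    "print(json.dumps(resp), flush=True)\n"
)

proc = subprocess.Popen(
    [sys.executable, "-c", CHILD],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
)

# One message out, one message back -- each a single JSON line.
request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
proc.stdin.write(json.dumps(request) + "\n")
proc.stdin.flush()
response = json.loads(proc.stdout.readline())
proc.wait()

print(response)
```

An agent swaps the stub for `npx -y @parlay/mcp@latest` and keeps the same read/write loop.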
Auth differs per transport
HTTP uses bearer tokens scoped to your dashboard account. stdio inherits the shell environment, so keep PARLAY_API_KEY out of committed files.
Tool example
What a real call looks like on the wire — request and response are both typed JSON.
get_quote request:

{
  "tool": "get_quote",
  "arguments": {
    "market_id": "polymarket:will-fed-cut-rates-in-june-2026"
  }
}

get_quote response:

{
  "venue": "polymarket",
  "market_id": "polymarket:will-fed-cut-rates-in-june-2026",
  "yes_price": 0.62,
  "no_price": 0.39,
  "implied_probability": 0.62,
  "last_trade_at": "2026-04-27T11:42:18Z"
}

get_orderbook request:

{
  "tool": "get_orderbook",
  "arguments": {
    "market_id": "kalshi:FED-25JUN-CUT",
    "depth": 5
  }
}

get_orderbook response:

{
  "venue": "kalshi",
  "market_id": "kalshi:FED-25JUN-CUT",
  "yes": [
    { "price": 0.61, "size": 1200 },
    { "price": 0.60, "size": 3400 }
  ],
  "no": [
    { "price": 0.40, "size": 980 },
    { "price": 0.41, "size": 2100 }
  ],
  "as_of": "2026-04-27T11:42:18Z"
}
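Because the responses are typed, downstream arithmetic is trivial. Two quick reads of the example payloads above: the quote's yes and no prices sum to slightly more than 1 (the venue's spread, or overround), and the order-book levels can be summed to gauge available size:

```python
# The relevant fields from the two example responses above.
quote = {"yes_price": 0.62, "no_price": 0.39}
book = {
    "yes": [{"price": 0.61, "size": 1200}, {"price": 0.60, "size": 3400}],
    "no": [{"price": 0.40, "size": 980}, {"price": 0.41, "size": 2100}],
}

# Overround: how far the two prices exceed a fair 100% book.
overround = quote["yes_price"] + quote["no_price"] - 1.0

# Total contracts resting on each side of the book.
yes_depth = sum(level["size"] for level in book["yes"])
no_depth = sum(level["size"] for level in book["no"])

print(round(overround, 4), yes_depth, no_depth)  # 0.01 4600 3080
```

A model can run exactly this kind of reasoning over the JSON it receives, which is why typed responses beat screen-scraped text.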