Insights · Apr 2, 2026 · 10 min read

5 Ways to Connect Your AI Agent to Prediction Markets in 2026

Five integration approaches for connecting AI agents to prediction markets — from MCP servers to custom scrapers — with code examples and honest trade-offs.

SimpleFunctions · AI-generated
#ai-agents #mcp #prediction-markets #langchain #integration #tutorial

AI agents that can reason about the future need access to prediction market data. These markets represent the most efficient aggregation of crowd intelligence on upcoming events — from Fed rate decisions to geopolitical outcomes to tech milestones.

But how do you actually wire your agent to this data? Here are five approaches, ranked from highest-level abstraction to lowest, with honest assessments of each.


1. MCP Server (SimpleFunctions)

Best for: Claude Code, Cursor, and any MCP-compatible agent framework.

The Model Context Protocol (MCP) is Anthropic's open standard for connecting AI agents to external tools. SimpleFunctions exposes the full prediction market stack — scanning, querying, thesis management, edge detection, and trading — as an MCP server.

Setup

One command:

claude mcp add simplefunctions --url https://simplefunctions.dev/api/mcp/mcp

That's it. Your Claude Code agent now has access to 25+ tools.

What Your Agent Can Do

Agent: "What's the market saying about a recession?"

→ calls scan_markets("recession")
→ returns: KXRECSSNBER-26 trading at 37c (37% implied probability)
→ returns: Polymarket "US Recession 2026" at 34c
→ agent synthesizes: "Markets price recession risk at 34-37%.
   Kalshi is slightly higher than Polymarket."
Agent: "I think tariffs will cause a recession. Create a thesis."

→ calls create_thesis("Tariff escalation leads to US recession by Q4 2026")
→ system decomposes into causal tree
→ maps to 12 Kalshi/Polymarket contracts
→ identifies 3 edges where market disagrees with thesis
→ starts 24/7 monitoring

Pros

  • Zero plumbing: MCP handles the connection, serialization, and tool discovery.
  • Full context: not just prices — includes X/Twitter sentiment, news, traditional market data.
  • Stateful: thesis engine maintains state across sessions, so your agent doesn't start from scratch every conversation.
  • Trading capabilities: agent can check balances, view positions, and place orders (with your authorization).

Cons

  • Requires an MCP-compatible host (Claude Code, Cursor, or custom MCP client).
  • Thesis features need an API key (free scanning works without auth).

2. Direct Exchange APIs (Kalshi REST + Polymarket CLOB)

Best for: custom agent frameworks, Python-based systems, maximum control.

Both Kalshi and Polymarket offer well-documented REST APIs with official SDKs.

Kalshi Example

import httpx
import json

KALSHI_BASE = "https://api.kalshi.com/trade-api/v2"

# Authenticate
resp = httpx.post(f"{KALSHI_BASE}/login", json={
    "email": "you@example.com",
    "password": "your-password"
})
token = resp.json()["token"]
headers = {"Authorization": f"Bearer {token}"}

# Search markets
markets = httpx.get(
    f"{KALSHI_BASE}/markets",
    headers=headers,
    params={"status": "open", "series_ticker": "KXINX"}
).json()

for m in markets["markets"]:
    print(f"{m['ticker']}: yes={m['yes_ask']}c  vol={m['volume']}")

Polymarket Example

from py_clob_client.client import ClobClient

client = ClobClient(
    host="https://clob.polymarket.com",
    key="your-private-key",  # Polygon wallet private key used to sign requests
    chain_id=137             # Polygon mainnet
)

# get_markets() returns a paginated response; the contracts live under "data"
for m in client.get_markets()["data"]:
    print(f"{m['question']}: {m['tokens'][0]['price']}")

Wrapping as an Agent Tool

from langchain.tools import tool

# Reuses KALSHI_BASE, headers, httpx, and json from the Kalshi example above.
@tool
def search_kalshi(query: str) -> str:
    """Search Kalshi prediction markets for contracts matching the query."""
    # "cursor" is a pagination token, not a search parameter, so fetch
    # open markets and filter on the title client-side instead.
    resp = httpx.get(
        f"{KALSHI_BASE}/markets",
        headers=headers,
        params={"status": "open", "limit": 100}
    )
    matches = [
        m for m in resp.json()["markets"]
        if query.lower() in m["title"].lower()
    ][:10]
    return json.dumps([{
        "ticker": m["ticker"],
        "title": m["title"],
        "yes_ask": m["yes_ask"],
        "volume": m["volume"]
    } for m in matches])

Pros

  • Full control over every API parameter.
  • Direct order book access for trading.
  • No intermediary — lowest possible latency.
  • Official SDKs in Python, TypeScript, Rust.

Cons

  • You manage auth (Kalshi tokens expire every 30 minutes; Polymarket uses HMAC).
  • Single venue per integration — no cross-venue normalization.
  • No context enrichment — you get prices, not intelligence.
  • Rate limits and error handling are your responsibility.
  • Ongoing maintenance as APIs evolve.

3. Oddpool WebSocket + REST API

Best for: agents that need real-time streaming data, whale alerts, or arbitrage detection.

Oddpool aggregates data from Kalshi, Polymarket, and Opinion into a unified API. Its WebSocket feed is particularly useful for agents that need to react to market events in real time.

REST Example

import httpx

ODDPOOL_BASE = "https://api.oddpool.com/v1"
headers = {"Authorization": "Bearer op_your_key"}

# Search across venues
markets = httpx.get(
    f"{ODDPOOL_BASE}/markets/search",
    headers=headers,
    params={"q": "tariffs", "limit": 10}
).json()

for m in markets["results"]:
    print(f"[{m['venue']}] {m['title']}: {m['price']}c")

# Get whale trades
whales = httpx.get(
    f"{ODDPOOL_BASE}/whales/recent",
    headers=headers,
    params={"min_size": 1000, "limit": 20}
).json()

for w in whales["trades"]:
    print(f"${w['size']:,} {w['side']} on {w['market_title']} @ {w['price']}c")

WebSocket Example

import asyncio
import websockets
import json

async def stream_whales():
    uri = "wss://ws.oddpool.com/v1/stream"
    async with websockets.connect(uri) as ws:
        await ws.send(json.dumps({
            "type": "subscribe",
            "channels": ["whales", "arb_opportunities"],
            "auth": "op_your_key"
        }))
        async for message in ws:
            data = json.loads(message)
            if data["type"] == "whale_trade":
                print(f"Whale: ${data['size']:,} on {data['market']}")
            elif data["type"] == "arb_opportunity":
                print(f"Arb: {data['spread']}c spread on {data['market']}")

asyncio.run(stream_whales())
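
Long-lived WebSocket connections drop in practice, so production agents usually wrap the stream in a reconnect loop. A sketch with exponential backoff and jitter (the backoff parameters are arbitrary; pass in the `stream_whales` coroutine from above):

```python
import asyncio
import random
from typing import Awaitable, Callable

def backoff_delay(attempt: int, base: float = 1.0, cap: float = 60.0) -> float:
    """Exponential backoff with jitter, capped at `cap` seconds."""
    return min(cap, base * 2 ** attempt) * random.uniform(0.5, 1.0)

async def run_forever(stream: Callable[[], Awaitable[None]]) -> None:
    """Restart a streaming coroutine whenever the connection drops."""
    attempt = 0
    while True:
        try:
            await stream()
            attempt = 0  # clean exit: reset the backoff counter
        except Exception as exc:
            delay = backoff_delay(attempt)
            print(f"stream dropped ({exc!r}); reconnecting in {delay:.1f}s")
            await asyncio.sleep(delay)
            attempt += 1

# Usage: asyncio.run(run_forever(stream_whales))
```

The jitter matters when you run multiple subscribers: without it, every client reconnects at the same instant after an outage.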

Pros

  • Cross-venue normalization — one API, all exchanges.
  • Real-time whale tracking and arbitrage detection built in.
  • WebSocket streaming for event-driven agent architectures.
  • Clean, well-documented API.

Cons

  • No reasoning or analysis layer — you get data, not insights.
  • No execution — read-only data, no order placement.
  • Paid tiers for the most useful features ($30-100/month).
  • Limited to prediction market data — no social, news, or traditional market enrichment.

4. LangChain / LangGraph Custom Tools

Best for: teams already invested in the LangChain ecosystem building multi-step agent workflows.

LangChain doesn't have a built-in prediction market integration, but its tool abstraction makes it straightforward to wrap any of the above APIs as agent tools.

Complete Agent Example

from langchain_openai import ChatOpenAI
from langchain.agents import AgentExecutor, create_tool_calling_agent
from langchain_core.prompts import ChatPromptTemplate
from langchain.tools import tool
import httpx

SF_BASE = "https://simplefunctions.dev/api"

@tool
def query_prediction_markets(question: str) -> str:
    """Query prediction markets for probabilities on future events.
    Returns live contract prices from Kalshi and Polymarket,
    X/Twitter sentiment, and an AI-synthesized answer."""
    resp = httpx.get(f"{SF_BASE}/public/query", params={
        "q": question,
        "mode": "full"
    })
    return resp.text

@tool
def scan_markets(keywords: str) -> str:
    """Search Kalshi and Polymarket for contracts matching keywords.
    Returns live bid/ask prices, volume, and liquidity."""
    resp = httpx.get(f"{SF_BASE}/public/scan", params={
        "q": keywords,
        "limit": 15
    })
    return resp.text

@tool
def get_market_context() -> str:
    """Get a global market intelligence snapshot including
    top edges, price movers, and traditional market data."""
    resp = httpx.get(f"{SF_BASE}/public/context")
    return resp.text

# Build the agent
llm = ChatOpenAI(model="gpt-4o")
tools = [query_prediction_markets, scan_markets, get_market_context]

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a prediction market analyst. Use tools to answer questions with live market data."),
    ("human", "{input}"),
    ("placeholder", "{agent_scratchpad}")
])

agent = create_tool_calling_agent(llm, tools, prompt)
executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

result = executor.invoke({
    "input": "What are the odds of a Fed rate cut in June 2026? Compare Kalshi and Polymarket."
})
print(result["output"])

Pros

  • Integrates with the broader LangChain ecosystem (memory, RAG, multi-agent).
  • Flexible — wrap any data source as a tool.
  • Good for multi-step reasoning workflows (LangGraph).
  • Large community and documentation.

Cons

  • Boilerplate-heavy compared to MCP.
  • You're responsible for prompt engineering the tool descriptions.
  • No built-in state management for prediction market concepts (theses, edges).
  • LangChain abstraction layers can add latency and debugging complexity.
  • You're still just wrapping an API — the tool is only as good as the underlying data source.

5. Custom Scrapers + Selenium/Playwright

Best for: accessing data not available through official APIs (niche platforms, alternative signals).

Sometimes the data you need isn't available through any API. Custom scrapers fill the gap — but they come with significant trade-offs.

Example: Scraping Market Pages

from playwright.async_api import async_playwright
import asyncio

async def scrape_market_page(url: str) -> dict:
    async with async_playwright() as p:
        browser = await p.chromium.launch(headless=True)
        page = await browser.new_page()
        await page.goto(url, wait_until="networkidle")

        # Extract market data from rendered page
        title = await page.text_content("h1")
        price_el = await page.query_selector("[data-testid='price']")
        price = await price_el.text_content() if price_el else "N/A"

        # Extract recent trades from the DOM
        trades = await page.query_selector_all("[data-testid='trade-row']")
        trade_data = []
        for t in trades[:10]:
            trade_data.append({
                "size": await (await t.query_selector(".size")).text_content(),
                "price": await (await t.query_selector(".price")).text_content(),
            })

        await browser.close()
        return {"title": title, "price": price, "recent_trades": trade_data}

# Usage
data = asyncio.run(scrape_market_page("https://kalshi.com/markets/some-market"))

Pros

  • Access any data visible in a browser — no API required.
  • Can capture data from platforms without public APIs.
  • Useful for niche prediction platforms, social sentiment, or alternative data.

Cons

  • Fragile: DOM changes break your scraper. Ongoing maintenance is high.
  • Slow: browser rendering adds seconds of latency per request.
  • Terms of service: most platforms explicitly prohibit scraping. Legal risk.
  • Rate limiting: aggressive scraping gets your IP blocked.
  • No real-time: scraping is inherently polling-based. WebSocket feeds are always better for real-time data.
  • Not recommended for any platform that offers an official API.
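
If you scrape anyway, at minimum respect the target's robots.txt before fetching. A small sketch using only the standard library (the rules string below is illustrative; in practice fetch the site's real robots.txt):

```python
from urllib.robotparser import RobotFileParser

def is_allowed(robots_txt: str, user_agent: str, url: str) -> bool:
    """Parse a robots.txt body and check whether `url` may be fetched."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(user_agent, url)

# Illustrative rules -- fetch the real file from https://<host>/robots.txt
rules = """
User-agent: *
Disallow: /markets/
Allow: /blog/
"""

print(is_allowed(rules, "my-agent", "https://example.com/markets/foo"))  # False
print(is_allowed(rules, "my-agent", "https://example.com/blog/post"))    # True
```

A disallowed path doesn't make scraping legal, but fetching disallowed paths makes a terms-of-service complaint much harder to defend against.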

Comparison Matrix

| Approach | Setup Time | Real-Time | Cross-Venue | AI Context | Trading | Maintenance |
|---|---|---|---|---|---|---|
| MCP (SimpleFunctions) | 1 minute | 15-min heartbeat | Yes | Yes | Yes | None |
| Direct Exchange API | Hours | WebSocket | No (per venue) | No | Yes | High |
| Oddpool API | 30 min | WebSocket | Yes | No | No | Low |
| LangChain tools | 1-2 hours | Depends on source | Depends on source | Depends | Depends | Medium |
| Custom scrapers | Days | No (polling) | Manual | No | No | Very High |

Recommendation

If you're building an AI agent that needs prediction market access, start with the MCP approach (SimpleFunctions). It's the fastest path from zero to a working agent with full market access, and MCP is quickly becoming the standard protocol for agent-tool communication.

If you need real-time streaming data — especially whale tracking or arbitrage alerts — add Oddpool's WebSocket feed to your stack.

If you need full trading control with sub-second execution, go direct to the exchange API.

And if you're already deep in the LangChain ecosystem, wrapping SimpleFunctions' public REST API as a LangChain tool gives you the best of both worlds: LangChain's orchestration with SimpleFunctions' market intelligence.

Skip the custom scraper unless you have a very specific data source that truly isn't available through any API. The maintenance cost is rarely worth it.

The prediction market data landscape is evolving fast. The right integration choice today might change in six months. Build with abstractions that let you swap layers without rewriting your agent from scratch.
