Bring your own AI

Use the AI you already pay for.

No new subscription. No in-app tokens. Paste our MCP endpoint into Claude Desktop, ChatGPT, Cursor, Claude Code, or Windsurf — that AI now builds live dashboards, runs queries, and publishes artifacts into your DataUnmess workspace.

No API tokens · Flagship models (Opus, GPT-5, Gemini) · Switch vendors anytime

Why not just Claude.ai or ChatGPT?

Here's what they can't do.

Why you still want DataUnmess sitting behind whichever AI you use.

Claude.ai / ChatGPT → DataUnmess

  • Frozen React, fake data → Live queries on your real warehouse
  • Stuck inside one conversation → Persistent, team-shared workspace
  • Locked to one vendor → Any MCP client; switch whenever
  • One user, one chat → Roles, invites, shared ownership

The easy way — for AI agents

One-prompt setup

Open Claude Desktop, Claude Code, Cursor, Windsurf, or any AI interface and paste this:

I want to install the DataUnmess MCP server (my own endpoint at https://mcp-dev.dataunmess.ai/mcp) into this AI client. Fetch https://www.dataunmess.ai/connect-mcp for the exact config snippet, then add it to this client's MCP config. Use the literal string <YOUR_MCP_KEY> as the bearer token — keeping the real key out of this chat is intentional, I'll paste it in myself after you save the file.

The AI fetches this page, edits the right config file with a <YOUR_MCP_KEY> placeholder, then hands off to you to paste the real key. Keeping the key out of the chat keeps it out of any logs the AI client sends to its provider.
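The hand-off can be pictured as a small script. This is a hypothetical sketch of what the AI does on your behalf (the function name and demo path are illustrative): merge a dataunmess entry into an MCP config file, writing the literal <YOUR_MCP_KEY> placeholder rather than the real key.

```python
import json
import tempfile
from pathlib import Path

# Hypothetical sketch of the AI's edit: merge a "dataunmess" entry into
# an existing MCP config file, keeping the <YOUR_MCP_KEY> placeholder
# instead of the real key.
def add_dataunmess_entry(config_path: Path) -> dict:
    config = json.loads(config_path.read_text()) if config_path.exists() else {}
    servers = config.setdefault("mcpServers", {})
    servers["dataunmess"] = {
        "url": "https://mcp-dev.dataunmess.ai/mcp",
        "headers": {"Authorization": "Bearer <YOUR_MCP_KEY>"},  # you paste the real key afterwards
    }
    config_path.write_text(json.dumps(config, indent=2))
    return config

if __name__ == "__main__":
    # Demo against a throwaway file, not your real client config.
    demo = Path(tempfile.mkdtemp()) / "claude_desktop_config.json"
    add_dataunmess_entry(demo)
    print(demo.read_text())
```

Because the placeholder is a literal string, nothing secret ever enters the conversation; the only manual step left is pasting the key into the saved file.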

Prefer to do it yourself? Three steps.

The manual path — if you'd rather not delegate to the AI.

1

Generate your MCP key

Visit /account/mcp-server in DataUnmess, click New MCP key, name it (e.g. "Claude Desktop"), and copy the key. You won't see it again — store it somewhere safe.

2

Add DataUnmess to your AI client

Pick your client, find its MCP config file, and add the dataunmess entry. Replace <YOUR_API_KEY> with the key from step 1.

Claude Desktop

Config file:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
  • Linux: ~/.config/Claude/claude_desktop_config.json

Add this entry:

{
  "mcpServers": {
    "dataunmess": {
      "url": "https://mcp-dev.dataunmess.ai/mcp",
      "headers": {
        "Authorization": "Bearer <YOUR_API_KEY>"
      }
    }
  }
}

Claude Code

Config file:

  • CLI: claude mcp add --transport http dataunmess https://mcp-dev.dataunmess.ai/mcp --header "Authorization: Bearer <YOUR_API_KEY>"
  • or edit: ~/.claude.json

Add this entry:

{
  "mcpServers": {
    "dataunmess": {
      "url": "https://mcp-dev.dataunmess.ai/mcp",
      "headers": {
        "Authorization": "Bearer <YOUR_API_KEY>"
      }
    }
  }
}

Cursor

Config file:

  • All OS: ~/.cursor/mcp.json

Add this entry:

{
  "mcpServers": {
    "dataunmess": {
      "url": "https://mcp-dev.dataunmess.ai/mcp",
      "headers": {
        "Authorization": "Bearer <YOUR_API_KEY>"
      }
    }
  }
}

Windsurf / Codeium

Config file:

  • All OS: ~/.codeium/windsurf/mcp_config.json

Add this entry:

{
  "mcpServers": {
    "dataunmess": {
      "url": "https://mcp-dev.dataunmess.ai/mcp",
      "headers": {
        "Authorization": "Bearer <YOUR_API_KEY>"
      }
    }
  }
}

After editing the config, restart the AI client so it picks up the new server.
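A quick sanity check before restarting can save a round trip. A minimal sketch (the function name and return strings are illustrative; point it at your client's config file from the list above):

```python
import json
from pathlib import Path

# Illustrative pre-restart check: confirm the config parses, the
# dataunmess entry exists, and the placeholder was replaced.
def check_mcp_config(path: Path, server: str = "dataunmess") -> str:
    config = json.loads(path.read_text())  # raises if the JSON is malformed
    entry = config["mcpServers"][server]   # raises KeyError if the entry is missing
    if "<YOUR_API_KEY>" in entry["headers"]["Authorization"]:
        return "placeholder still present: paste your real key"
    return "ok"
```

A malformed file typically makes the client silently skip the server, so failing loudly here is the point.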

3

Try it

In your AI interface, send any of these:

> List my DataUnmess datasets
> In DataUnmess, query the payments dataset and aggregate revenue by payment_method
> With c:\tmp\sales.csv, create a DataUnmess bar chart of revenue by region
> Clean my Q1 leads Google Sheet — dedupe by email and split full_name into first / last
> Swap the Q1 Revenue dashboard to use the cleaned-leads-v2 dataset

The AI will call DataUnmess's MCP tools, fetch data, and return a link to the generated dashboard.
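Under the hood, MCP frames each of those tool calls as a JSON-RPC 2.0 message that your AI client builds for you. A sketch of that framing (the tool name matches this page's tool list, but the argument names here are hypothetical):

```python
import json

# Illustrative MCP tool-call framing: JSON-RPC 2.0 with method
# "tools/call". Your AI client constructs and sends these for you.
def tool_call(tool: str, arguments: dict, request_id: int = 1) -> str:
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical arguments for the payments prompt above.
payload = tool_call("query_data", {"dataset": "payments", "group_by": "payment_method"})
```

You never write these payloads yourself; seeing the shape just makes the client logs easier to read when debugging a connection.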

Don't want to wire it up yourself?

Hire our onboarding service

We plan and execute the data integrations, then build the insights and artifacts for you — dashboards, flows, the works.

Get a quote
MCP tools your AI can call

query_data

Filter, group, and aggregate datasets (CSV, Excel, uploads). Returns result rows as JSON.

build_chart

Generate a chart spec from queried data. 22 chart types: bar, line, donut, scatter, treemap, funnel, radar, sankey, heatmap, waterfall, gantt, and more.

build_analysis

Create an analysis panel with text summary and KPI highlight cards showing metrics and trends.

query_connection

Execute a SELECT SQL query against a connected database (PostgreSQL, MySQL). Read-only — write queries are blocked.

query_github

Browse a connected GitHub repo: tree, file contents, languages, code search.

query_google_sheets

Read a connected Google Sheets / Drive source. Actions: list_spreadsheets, get_metadata, read_range, read_sheet_as_table. Read-only OAuth scopes.

build_flowchart

Build a flowchart with 8 node shapes (rect, diamond, ellipse, hexagon, parallelogram, cylinder, cloud, document), optional lucide icons per node, and labeled edges. Decision diamonds with multiple outgoing edges auto-fan into parallel lanes. Accepts structured nodes+edges JSON or a Mermaid `flowchart TD|LR` source block.

Four resources are also exposed: dash-ai://datasets, dash-ai://datasets/{id}, dash-ai://connections, and dash-ai://connections/{id}. Read dash-ai://connections first to discover available data sources before calling query_* tools.
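That discovery step uses the standard MCP resources/read call, which your client issues when the AI asks for dash-ai://connections. A sketch of the request (the URI comes from the list above; the function name is illustrative):

```python
import json

# Illustrative MCP resource read: JSON-RPC 2.0 with method
# "resources/read", issued by the client before any query_* tool call.
def read_resource(uri: str, request_id: int = 1) -> str:
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "resources/read",
        "params": {"uri": uri},
    })

request = read_resource("dash-ai://connections")
```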

Questions or stuck?

Open an issue on GitHub or check the user guide.