- “How many invoices did we issue last week?”
- “Create a new customer called ACME Inc. with address 123 Main St”
## Getting started
Run the server locally with npx (example below) and connect to it from your MCP client; the LLM then works through the tool chain (`list_api_endpoints` → `get_api_endpoint_schema` → `invoke_api_endpoint`) to fulfil your requests.
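A minimal local invocation might look like the following sketch. The package name matches the command used in the Cursor section below; the `CONDUCTOR_SECRET_KEY` environment variable is an assumption, so substitute whatever credential mechanism your Conductor account uses.

```bash
# Start the Conductor MCP server locally over stdio
# (CONDUCTOR_SECRET_KEY is a hypothetical env var for your API credential)
CONDUCTOR_SECRET_KEY="<your-secret-key>" npx -y conductor-node-mcp
```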
## Which client should I use?
- Cursor – best for developers working inside a repo/IDE with inline tool usage and project/global config files.
- Claude Desktop – great general‑purpose client on macOS/Windows; easy JSON config and strong tool orchestration.
- Claude (web) – use when you want quick remote access via a URL without local config files.
- ChatGPT (Connectors) – available in certain plans; currently expects `search`/`fetch` tools and may need a wrapper for full compatibility.
## Client setup
### Cursor
Cursor supports MCP via project and global config files and works with stdio or remote HTTP/SSE transports; a sample config is sketched below.

Config files:
- Project: `.cursor/mcp.json`
- Global: `~/.cursor/mcp.json`
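For a local stdio setup, a project-level `.cursor/mcp.json` entry might look like this sketch; the `mcpServers` key is Cursor's standard config shape, while the server name and the `CONDUCTOR_SECRET_KEY` environment variable are illustrative assumptions.

```json
{
  "mcpServers": {
    "conductor": {
      "command": "npx",
      "args": ["-y", "conductor-node-mcp"],
      "env": { "CONDUCTOR_SECRET_KEY": "<your-secret-key>" }
    }
  }
}
```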
For a remote connection, start the server in HTTP mode with `npx -y conductor-node-mcp --transport=http --port=3000`. If your Cursor build supports auth headers in mcp.json, pass an `Authorization` bearer token or use the alternative header `x-conductor-secret-key`. Otherwise, prefer stdio or a trusted local network. See: Cursor Docs: Model Context Protocol.
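Assuming your Cursor build does accept a `url` plus `headers` block for remote servers, the corresponding entry might look roughly like this sketch:

```json
{
  "mcpServers": {
    "conductor-remote": {
      "url": "http://localhost:3000",
      "headers": { "Authorization": "Bearer <your-secret-key>" }
    }
  }
}
```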
### Claude Desktop
For a local (stdio) configuration, edit `claude_desktop_config.json` (sample below):
- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
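A minimal entry might look like the following sketch, using the same shape as the Cursor stdio example above; the server name and the `CONDUCTOR_SECRET_KEY` environment variable are illustrative assumptions.

```json
{
  "mcpServers": {
    "conductor": {
      "command": "npx",
      "args": ["-y", "conductor-node-mcp"],
      "env": { "CONDUCTOR_SECRET_KEY": "<your-secret-key>" }
    }
  }
}
```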
### Claude (web)
Claude on the web supports connecting to remote MCP servers. Run Conductor in Remote mode (HTTP) and add it from Claude’s integrations/settings UI, then approve tool calls as needed. See: MCP Clients – modelcontextprotocol.io

### ChatGPT (Connectors)
ChatGPT supports “Custom Connectors (MCP)” in Settings → Connectors. See: OpenAI Help: Connectors in ChatGPT

Compatibility note: ChatGPT may show the error “This MCP server doesn’t implement our specification” if required tools are missing. As of September 2025, Connectors expect servers to implement tools named `search` and `fetch`. Our Conductor MCP server exposes explicit tools per endpoint or dynamic tools such as `list_api_endpoints`/`invoke_api_endpoint`, so it won’t directly satisfy that requirement without a thin wrapper server that provides `search`/`fetch` and delegates to Conductor. See the troubleshooting section in the help article for details.
## Advanced options
### Remote mode (HTTP)
Run the MCP server as a remote server using Streamable HTTP transport:
- Auth via header `Authorization: Bearer <token>` or `x-conductor-secret-key: <token>`.
- You can also use `--socket` to bind to a Unix socket (example below).
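For example, the flags shown here and in the Cursor section can be combined roughly as follows; the socket path and its `--socket=<path>` value form are illustrative assumptions.

```bash
# Remote mode over TCP, as in the Cursor section above
npx -y conductor-node-mcp --transport=http --port=3000

# Remote mode bound to a Unix socket instead of a port (path is illustrative)
npx -y conductor-node-mcp --transport=http --socket=/tmp/conductor-mcp.sock
```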
In Remote mode, tool filters and client/capability settings can also be passed as query parameters, for example:
- `http://localhost:3000?resource=cards&resource=accounts&no_tool=create_cards`
- `http://localhost:3000?client=cursor&capability=tool-name-length%3D40`
### Choosing the tool style
You have two options when starting the server (examples below):
- Explicit tools – one MCP tool per Conductor endpoint. Useful when you know exactly which operations you need and want the most accurate parameter suggestions.
- Dynamic tools (`--tools=dynamic`) – three generic tools that let the LLM search, inspect, and invoke any endpoint on demand. Helpful when you want the entire API surface in a compact form.
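For instance, the two styles might be started as follows; the `--tools=dynamic` flag is from the list above, while treating explicit tools as the default when the flag is omitted is an assumption.

```bash
# Explicit tools: one MCP tool per Conductor endpoint (assumed default)
npx -y conductor-node-mcp

# Dynamic tools: list_api_endpoints / get_api_endpoint_schema / invoke_api_endpoint
npx -y conductor-node-mcp --tools=dynamic
```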
You can also filter which tools are exposed (see the combined example after this list):
- `--tool=<name>` – include a specific tool by name
- `--resource=<pattern>` – include all tools under a resource (supports wildcards, e.g. `qbd.invoices*`)
- `--operation=read|write` – include just read or write operations
- `--tag=<tag>` – include tools by tag
- Exclusions available via `--no-tool`, `--no-resource`, etc.
- Use `--list` to preview which tools will be exposed
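A combined invocation using these filter flags might look like the following sketch; the specific resource, operation, and excluded-tool values are illustrative.

```bash
# Preview which tools would be exposed with these filters
npx -y conductor-node-mcp --resource='qbd.invoices*' --operation=read --no-tool=create_cards --list

# Start the server with the same filters once the list looks right
npx -y conductor-node-mcp --resource='qbd.invoices*' --operation=read --no-tool=create_cards
```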
### Capabilities & clients
Different LLM clients have different schema limitations. Pass `--client=<type>` so the MCP server tailors its output accordingly. Supported values: `openai-agents`, `claude`, `claude-code`, `cursor`.
Fine-tune capabilities with `--capability` flags, comma-separated or repeated (combined example after this list):
- `top-level-unions` – enable top-level unions in tool schemas
- `valid-json` – allow JSON string parsing for arguments
- `refs` – enable `$ref` pointers in schemas
- `unions` – enable `anyOf` union types
- `formats` – enable format validations like `date-time`
- `tool-name-length=N` – set maximum tool name length
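Putting the two flags together, invocations tuned for particular clients might look like this sketch; the specific capability choices are illustrative.

```bash
# Tailor schemas for Cursor and cap tool names at 40 characters
npx -y conductor-node-mcp --client=cursor --capability=tool-name-length=40

# Capabilities can also be passed comma-separated
npx -y conductor-node-mcp --client=claude --capability=unions,formats
```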

