LLM Brand Monitor
Track how 350+ AI models mention your brand — manage projects, run scans, analyze results
Install LLM Brand Monitor in your MCP client
LLM Brand Monitor is a Model Context Protocol server. Add it to your MCP client config once, restart, and the server's tools become available to your AI assistant. The same JSON snippet below works across all four major clients — only the config file path differs.
- Locate your client's MCP config file:
  - Cursor: ~/.cursor/mcp.json
  - Claude Desktop (macOS): ~/Library/Application Support/Claude/claude_desktop_config.json
  - Claude Desktop (Windows): %APPDATA%\Claude\claude_desktop_config.json
  - VS Code: Settings → Extensions → MCP
  - Windsurf: Settings → MCP Servers
- Add LLM Brand Monitor to the mcpServers map by pasting the snippet below into your config file. If you already have other MCP servers, merge the entry into the existing mcpServers object.
- Restart your client so it picks up the new server.
- Verify: ask the assistant to list available tools; LLM Brand Monitor's tools should appear.
{
"mcpServers": {
"llm-brand-monitor": {
"command": "npx",
"args": [
"-y",
"@serpstat/llm-brand-monitor-mcp"
],
"env": {
"LBM_API_KEY": "<your-lbm-api-key>"
}
}
}
}

Install LLM Brand Monitor in Cursor
Open ~/.cursor/mcp.json in your editor, paste the snippet above into mcpServers, save, and restart Cursor. LLM Brand Monitor will show up in the assistant's tool list on next launch.
Install LLM Brand Monitor in Claude Desktop (macOS)
Open ~/Library/Application Support/Claude/claude_desktop_config.json in your editor, paste the snippet above into mcpServers, save, and restart Claude Desktop (macOS). LLM Brand Monitor will show up in the assistant's tool list on next launch.
Install LLM Brand Monitor in Claude Desktop (Windows)
Open %APPDATA%\Claude\claude_desktop_config.json in your editor, paste the snippet above into mcpServers, save, and restart Claude Desktop (Windows). LLM Brand Monitor will show up in the assistant's tool list on next launch.
Install LLM Brand Monitor in VS Code
In VS Code, open Settings → Extensions → MCP, add the snippet above to the mcpServers configuration, save, and restart VS Code. LLM Brand Monitor will show up in the assistant's tool list on next launch.
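If your config file already registers other servers, the merge described in step 2 can be sketched in Python. This is an illustrative helper, not part of the package; the existing-server name is a placeholder:

```python
import json

# The server entry from the snippet above.
NEW_ENTRY = {
    "llm-brand-monitor": {
        "command": "npx",
        "args": ["-y", "@serpstat/llm-brand-monitor-mcp"],
        "env": {"LBM_API_KEY": "<your-lbm-api-key>"},
    }
}

def merge_mcp_servers(config: dict, entry: dict) -> dict:
    """Merge a server entry into the config's mcpServers map,
    preserving any servers already registered."""
    servers = config.setdefault("mcpServers", {})
    servers.update(entry)
    return config

# Demo: a config that already has another server registered.
existing = {"mcpServers": {"some-other-server": {"command": "npx"}}}
merged = merge_mcp_servers(existing, NEW_ENTRY)
print(json.dumps(merged, indent=2))
```

Read your client's config file (for Cursor, ~/.cursor/mcp.json), pass the parsed dict through the merge, and write it back; both the old server and llm-brand-monitor survive.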
npm package: @serpstat/llm-brand-monitor-mcp
Required environment variables
LLM Brand Monitor needs the following environment variables set before it can run. Add them to the env block of your mcpServers entry, or export them in your shell before launching the client.
LBM_API_KEY
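A missing LBM_API_KEY typically surfaces as an opaque startup failure. If you launch the server from your own wrapper script, a pre-flight check like this (our helper, not part of the package) fails fast with a clear message:

```python
import os

def require_env(name: str) -> str:
    """Return the value of a required environment variable,
    raising with a clear message if it is missing or empty."""
    value = os.environ.get(name, "").strip()
    if not value:
        raise RuntimeError(
            f"{name} is not set. Add it to the env block of your "
            "mcpServers entry, or export it before launching your client."
        )
    return value

# Demo with a placeholder value; a real key comes from your LBM account.
os.environ["LBM_API_KEY"] = "<your-lbm-api-key>"
print(require_env("LBM_API_KEY"))
```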
Transport
LLM Brand Monitor runs as a local process, so it communicates over the stdio transport that most AI clients use by default for locally installed servers.
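Over stdio, MCP messages are newline-delimited JSON-RPC 2.0 sent on the server process's stdin and stdout. As a rough sketch of what a client sends on startup (the client name and version are placeholders), the initialize handshake and a tools/list request look like:

```python
import json

def jsonrpc_line(method: str, params: dict, msg_id: int) -> str:
    """Frame a JSON-RPC 2.0 request as a single newline-terminated
    line, as the MCP stdio transport expects."""
    msg = {"jsonrpc": "2.0", "id": msg_id, "method": method, "params": params}
    return json.dumps(msg) + "\n"

# The client initializes the session, then asks for the tool list.
init = jsonrpc_line("initialize", {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": {"name": "example-client", "version": "0.1.0"},
}, msg_id=1)
list_tools = jsonrpc_line("tools/list", {}, msg_id=2)
print(init, list_tools, sep="")
```

Your MCP client does this framing for you; the sketch only shows why each message must be a single line with no embedded newlines.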
More MCP servers
Other Model Context Protocol servers in the same space as LLM Brand Monitor. Each one adds different capabilities to your AI assistant — pick based on the data sources or workflows you need.
nara-tour
Official MCP server for Nara Tour, a Suwon-based travel agency specializing in group tours. Provides AI agents with the company profile, sales information, and a KakaoTalk consultation link (with UTM parameters).
suprsonic-mcp
One API key, dozens of capabilities for your AI agent. Zero provider auth.
cxg-census-mcp
MCP server for the CZ CELLxGENE Census. Single-cell, ontology-aware. Community, unaffiliated.
agent-memory
Persistent, agent-owned memory with encrypted storage and shared knowledge commons.
Haiku DeFi MCP
DeFi execution for AI agents — swap, lend, bridge, and yield across 22 chains.
m2mcent
x402 payment wrapper for AI Agents and MCP Servers. USDC settlements on Base L2.
dep-diff-mcp
Translates a lockfile diff into a human-readable upgrade plan for npm and PyPI.
Arezzo
Deterministic Google Docs API compiler. Your agent stops silently corrupting documents.
Browse the full MCP server directory or use Stork's one-line install to let your agent pick the right server automatically.