by cbinsights
Provides an interface for developers to interact with CB Insights ChatCBI LLM through AI agents.
The server exposes a ChatCBI tool that lets agents send messages to the CB Insights LLM and receive structured responses, including chat IDs, related content, sources, and suggested prompts.
Copy the .env template and set the required environment variables:

- CBI_CLIENT_ID and CBI_CLIENT_SECRET for API authentication.
- CBI_MCP_PORT (default 8000) to change the listening port.
- CBI_MCP_TIMEOUT to adjust the request timeout.

Run the server with uv run server.py. To register it with Claude Desktop, run mcp install server.py; this writes a claude_desktop_config.json entry that points to the uv command. To debug, run mcp dev server.py.

The ChatCBI tool sends a message with an optional chatID and receives response details, related content, sources, and suggestions. Configuration lives in the .env file; to continue an existing conversation, the chatID (from a previous response) is required.

Q: Which Python version is needed?
A: Any version supported by uv; typically Python 3.9+.
Q: How do I obtain the client ID and secret?
A: Register an application on the CB Insights developer portal and follow the authentication docs linked in the README.
Q: Can I change the server’s host address?
A: Yes, set the HOST variable in the .env file (defaults to 0.0.0.0).
Q: What happens if I don’t provide a chatID?
A: The server creates a new ChatCBI session and returns a fresh chatID in the response.
Q: Is there a way to view logs?
A: Run the server with the --log-level DEBUG flag (or set LOG_LEVEL in .env).
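Collecting the variables mentioned in the setup notes and FAQ above, a filled-in .env file might look like the following sketch. All values are placeholders, and HOST and LOG_LEVEL are the optional settings described in the FAQ:

```shell
# Required OAuth credentials (placeholders, use your own)
CBI_CLIENT_ID=your-client-id
CBI_CLIENT_SECRET=your-client-secret

# Optional overrides (defaults shown)
CBI_MCP_PORT=8000
CBI_MCP_TIMEOUT=120
HOST=0.0.0.0
LOG_LEVEL=INFO
```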
The CBI MCP Server provides an interface for developers to interact with the CB Insights ChatCBI LLM through AI agents.
Send a message from an agent to ChatCBI and return the response.
Parameters:

- message: The content of your message to ChatCBI.
- chatID (optional): A unique identifier for the chat session, obtained from a previous response. If included, the conversation is continued; otherwise, a new conversation is started.

Response:

- chatID: Identifies the conversation. If chatID was provided in the request, this will be the same.
- message: The ChatCBI response to the message.
- relatedContent: List of related references.
- sources: List of sources used to generate the response.
- suggestions: List of suggested follow-up questions.
- title: Title of the chat.

The CBI MCP Server uses uv to manage the project.
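As an illustration of the fields described above, a ChatCBI request and response can be sketched as plain Python dictionaries. All concrete values here are hypothetical placeholders, not real API output:

```python
# Hypothetical request: `message` is required; `chatID` is optional and,
# when present, continues an existing conversation.
request = {
    "message": "What are the top fintech startups?",
    "chatID": "chat-001",  # omit this key to start a new conversation
}

# Hypothetical response: echoes the chatID (or returns a fresh one) plus
# the answer and its supporting metadata.
response = {
    "chatID": request.get("chatID", "chat-001"),
    "message": "Here is an overview of leading fintech startups...",
    "relatedContent": ["Related research brief (placeholder)"],
    "sources": ["cbinsights.com/research/fintech-250 (placeholder)"],
    "suggestions": ["Which of these raised funding in 2024?"],
    "title": "Top fintech startups",
}
```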
Environment variables are set via the .env file:
- CBI_CLIENT_ID & CBI_CLIENT_SECRET: OAuth Client ID and Secret
- CBI_MCP_TIMEOUT (default: 120)
- CBI_MCP_PORT (default: 8000)

Update the claude_desktop_config.json file using the following command:
mcp install server.py
This will add the following configuration:
{
  "mcpServers": {
    "cbi-mcp-server": {
      "command": "/path/to/.local/bin/uv",
      "args": [
        "--directory",
        "/path/to/cloned/cbi-mcp-server",
        "run",
        "server.py"
      ]
    }
  }
}
The MCP Inspector can be used to test and debug your server:
mcp dev server.py