by cbinsights
Provides an interface for developers to interact with the CB Insights ChatCBI LLM through AI agents.
The server exposes a ChatCBI tool that lets agents send messages to the CB Insights LLM and receive structured responses, including chat IDs, related content, sources, and suggested prompts.
To get started:
- Copy the .env template and set the required environment variables: CBI_CLIENT_ID and CBI_CLIENT_SECRET for API authentication.
- Set CBI_MCP_PORT (default 8000) to change the listening port.
- Set CBI_MCP_TIMEOUT to adjust the request timeout.
- Start the server with uv run server.py.
- Run mcp install server.py; this writes a claude_desktop_config.json entry that points to the uv command.
- Run mcp dev server.py to test the server with the MCP inspector.
- Send a message with an optional chatID; receive response details, related content, sources, and suggested prompts.
Configuration lives in the .env file; no existing chatID is required to start a conversation.

Q: Which Python version is needed?
A: Any version supported by uv
; typically Python 3.9+.
Q: How do I obtain the client ID and secret?
A: Register an application on the CB Insights developer portal and follow the authentication docs linked in the README.
Q: Can I change the server’s host address?
A: Yes, set the HOST
variable in the .env
file (defaults to 0.0.0.0
).
Q: What happens if I don’t provide a chatID
?
A: The server creates a new ChatCBI session and returns a fresh chatID
in the response.
Q: Is there a way to view logs?
A: Run the server with the --log-level DEBUG
flag (or set LOG_LEVEL
in .env
).
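Putting the variables mentioned above together, a .env file might look like the following sketch (shown in shell-compatible form; every value is a placeholder, and the CBI_MCP_TIMEOUT and LOG_LEVEL defaults are assumptions, not documented defaults):

```shell
# Hypothetical .env sketch -- all values below are placeholders.
export CBI_CLIENT_ID="your-client-id"
export CBI_CLIENT_SECRET="your-client-secret"
export CBI_MCP_PORT=8000      # default port per the README
export CBI_MCP_TIMEOUT=30     # request timeout in seconds; value assumed
export HOST=0.0.0.0           # listening address; default per the FAQ
export LOG_LEVEL=INFO         # see the FAQ on viewing logs
```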
The CBI MCP Server provides an interface for developers to interact with the CB Insights ChatCBI LLM through AI agents.

The ChatCBI tool takes the following input:
- message: The message to send to ChatCBI.
- chatID: (optional) The unique id of an existing ChatCBI session. Used for continuity in a conversation. If not provided, a new ChatCBI session will be created.

It returns:
- chatID: Unique id of the current ChatCBI session.
- message: ChatCBI message generated in response to the message sent in the input.
- RelatedContent: Content that is related to the content returned.
- Sources: Supporting sources for the message content returned.
- Suggestions: Suggested prompts to further explore the subject matter.

The CBI MCP Server uses uv to manage the project.
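Assuming the fields above arrive as a structured payload, the session-continuity behavior can be sketched as follows. The field names come from the README; the transport call is stubbed out, so everything else (the dataclass, the helper, the placeholder session id) is hypothetical:

```python
# Sketch of the ChatCBI tool's input/output shape and chatID threading.
# The actual MCP tool call is replaced by a stub for illustration.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ChatCBIResponse:
    chatID: str                  # unique id of the current ChatCBI session
    message: str                 # ChatCBI's reply to the input message
    RelatedContent: list = field(default_factory=list)
    Sources: list = field(default_factory=list)
    Suggestions: list = field(default_factory=list)

def chat_cbi(message: str, chatID: Optional[str] = None) -> ChatCBIResponse:
    """Stub standing in for the real tool call: when no chatID is given,
    a new session id is issued, mirroring the behavior described above."""
    session = chatID or "new-session-id"  # placeholder id
    return ChatCBIResponse(chatID=session, message=f"echo: {message}")

first = chat_cbi("What does CB Insights cover?")     # new session created
follow_up = chat_cbi("Tell me more", first.chatID)   # same session continues
```

Passing the returned chatID back in keeps the conversation in one session; omitting it starts a fresh one.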
The default port is 8000, but it can be modified by updating the CBI_MCP_PORT environment variable in the .env file. The timeout for requests can also be modified via the CBI_MCP_TIMEOUT variable in the .env file.
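A minimal sketch of how server.py might read these two settings; the variable names are from the README, while the parsing and the fallback timeout value are assumptions:

```python
# Read port and timeout from the environment, falling back to defaults.
import os

port = int(os.getenv("CBI_MCP_PORT", "8000"))        # documented default
timeout = float(os.getenv("CBI_MCP_TIMEOUT", "30"))  # fallback value assumed
```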
Documentation on how CB Insights APIs are authenticated can be found here. The server uses the CBI_CLIENT_ID and CBI_CLIENT_SECRET environment variables set in the .env file to authorize requests.
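One way the credentials might be read and attached to an authentication request is sketched below. The endpoint URL, payload field names, and header scheme are assumptions for illustration only; consult the CB Insights authentication docs for the actual flow:

```python
# Hedged sketch: build (but do not send) a credential request using the
# CBI_CLIENT_ID / CBI_CLIENT_SECRET variables from .env.
import json
import os
import urllib.request

client_id = os.getenv("CBI_CLIENT_ID", "demo-id")
client_secret = os.getenv("CBI_CLIENT_SECRET", "demo-secret")

body = json.dumps({"clientId": client_id, "clientSecret": client_secret}).encode()
req = urllib.request.Request(
    "https://api.cbinsights.com/v1/authorize",  # hypothetical endpoint
    data=body,
    headers={"Content-Type": "application/json"},
    method="POST",
)
# The request is only constructed here; sending it is left to the real server.
```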
Update the claude_desktop_config.json
file using the following command:
mcp install server.py
This will add the following configuration:
{
"mcpServers": {
"cbi-mcp-server": {
"command": "/path/to/.local/bin/uv",
"args": [
"--directory",
"/path/to/cloned/cbi-mcp-server",
"run",
"server.py"
]
}
}
}
The inspector can be used to test/debug your server.
mcp dev server.py