by fulcradynamics
Provides an MCP server that connects to Fulcra Context data through the fulcra-api library, handling OAuth2 flows and supporting both local stdio transport and remote Streamable HTTP transport.
Fulcra Context MCP exposes Fulcra Context data via the Model Context Protocol (MCP). It wraps the fulcra-api
Python client, manages OAuth2 token exchange internally, and offers a standardized MCP interface for downstream tools.
Local setup – Run the server locally with the stdio transport:
npx -y fulcra-context-mcp@latest
The server will start and listen for MCP requests over standard input/output.
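For context, the stdio transport exchanges JSON-RPC 2.0 messages over the server's stdin/stdout. A minimal sketch of the opening `initialize` request an MCP client would send — field values such as `protocolVersion` and the client name are illustrative assumptions, not specific to this server:

```python
import json

# Hypothetical illustration: the first message an MCP client writes to the
# server's stdin is a JSON-RPC 2.0 "initialize" request.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# Over stdio, each message is serialized as a single line of JSON.
wire_message = json.dumps(initialize_request)
print(wire_message)
```

The server replies with its own capabilities over stdout, after which the client can list and call the exposed tools.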
Remote connection – Use the public instance at https://mcp.fulcradynamics.com/mcp or proxy through mcp-remote:
{
  "mcpServers": {
    "fulcra_context": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://mcp.fulcradynamics.com/mcp"]
    }
  }
}
Configuration – No additional environment variables are required for basic operation; the server handles OAuth2 callbacks internally.
fulcra-api – Leverages the official Fulcra Python client for all API calls.
mcp-remote – Useful for proxying and troubleshooting remote connections.
Public instance – https://mcp.fulcradynamics.com/mcp.
Q: Do I need to provide my own API credentials? A: No – the server handles OAuth2 flows; you only need to authenticate via the provided login flow when prompted.
Q: Can I run the server in a Docker container?
A: Yes – simply invoke the same npx command inside a container that has Node.js and Python installed.
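A container setup could be sketched roughly as follows. The base image and package choices here are assumptions for illustration, not an official image published by the project:

```dockerfile
# Hypothetical sketch: a container with both Node.js (for npx) and Python,
# since the server wraps a Python client under the hood.
FROM node:22-slim

RUN apt-get update \
    && apt-get install -y --no-install-recommends python3 python3-pip \
    && rm -rf /var/lib/apt/lists/*

# Run the MCP server over stdio when the container starts.
ENTRYPOINT ["npx", "-y", "fulcra-context-mcp@latest"]
```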
Q: What transport should I choose? A: Use stdio for local development and Streamable HTTP when deploying behind a network or using the public instance.
Q: How do I debug connection issues?
A: Leverage the MCP Inspector UI or the mcp-remote CLI, which provide verbose logs and request tracing.
Q: Where can I report bugs or request features? A: Open an issue on the GitHub repository or join the Discord community linked in the README.
This is an MCP server that provides tools and resources to call the Fulcra API using fulcra-api.
There is a public instance of this server running at https://mcp.fulcradynamics.com/mcp. See https://fulcradynamics.github.io/developer-docs/mcp-server/ to get started quickly. This repo is primarily for users who need to run the server locally, want to see under the hood, or want to help contribute.
When run on its own (or when FULCRA_ENVIRONMENT is set to stdio), it acts as a local MCP server using the stdio transport. Otherwise, it acts as a remote server using the Streamable HTTP transport. It handles the OAuth2 callback but doesn't leak the exchanged tokens to MCP clients. Instead, it maintains a mapping table and runs its own OAuth2 service for MCP clients.
Claude for Desktop config:
{
"mcpServers": {
"fulcra_context": {
"command": "npx",
"args": [
"-y",
"mcp-remote",
"https://mcp.fulcradynamics.com/mcp"
]
}
}
}
Similar config using uvx:
{
"mcpServers": {
"fulcra_context": {
"command": "uvx",
"args": [
"fulcra-context-mcp@latest"
]
}
}
}
Please feel free to reach out via the GitHub repo for this project or join our Discord to reach out directly. Email also works (support@fulcradynamics.com).
{
  "mcpServers": {
    "fulcra-context-mcp": {
      "command": "npx",
      "args": ["-y", "fulcra-context-mcp@latest"],
      "env": {}
    }
  }
}