by GongRzhe
Provides a simple interface to query documents through a Langflow backend, enabling document Q&A via the Model Context Protocol.
Langflow Doc QA Server is a TypeScript‑based MCP server that forwards natural‑language queries to a Langflow Document Q&A flow and returns the generated answers. It acts as a bridge between Claude (or any MCP‑compatible client) and a Langflow workflow that handles document retrieval, language model inference, and response formatting.
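The forwarding step can be sketched as a single POST to the flow's run endpoint. The payload fields (`input_value`, `input_type`, `output_type`) and the nested response path below are assumptions based on common Langflow versions, not code taken from this repository:

```typescript
// Minimal sketch of forwarding a query to a Langflow run endpoint.
// Field names (input_value, input_type, output_type) and the nested
// response shape are assumptions; adjust them to your Langflow version.
type FetchLike = (
  url: string,
  init: { method: string; headers: Record<string, string>; body: string },
) => Promise<{ ok: boolean; status: number; json(): Promise<any> }>;

async function queryDocs(
  query: string,
  endpoint: string,
  fetchFn: FetchLike, // pass the global fetch in real use
): Promise<string> {
  const res = await fetchFn(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ input_value: query, input_type: "chat", output_type: "chat" }),
  });
  if (!res.ok) throw new Error(`Langflow request failed: HTTP ${res.status}`);
  const data = await res.json();
  // Walk the nested outputs; fall back to the raw JSON if the shape differs.
  return data?.outputs?.[0]?.outputs?.[0]?.results?.message?.text ?? JSON.stringify(data);
}
```

Injecting `fetchFn` keeps the function testable without a running Langflow instance.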
Copy the run URL of your Document Q&A flow from Langflow (for example
http://127.0.0.1:7860/api/v1/run/<flow-id>?stream=false
) and set the API_ENDPOINT environment variable to the copied URL. Then install, build, and start the server:

npm install
npm run build   # produces build/index.js
node build/index.js
For Claude Desktop, add the following entry to claude_desktop_config.json
(adjust the path and endpoint as needed):
{
  "mcpServers": {
    "langflow-doc-qa-server": {
      "command": "node",
      "args": ["/path/to/doc-qa-server/build/index.js"],
      "env": { "API_ENDPOINT": "http://127.0.0.1:7860/api/v1/run/<flow-id>?stream=false" }
    }
  }
}
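Once registered, Claude Desktop launches the command above and exchanges newline-delimited JSON-RPC messages with the server over stdio. A `tools/call` request for the query_docs tool would look roughly like the frame below; the `tools/call` shape follows the MCP specification, while the argument name `query` is an assumption about this server's schema:

```typescript
// Builds the newline-delimited JSON-RPC frame an MCP client would write to
// the server's stdin to invoke query_docs. The argument name ("query") is an
// assumption about this server's tool schema.
function buildToolCallFrame(id: number, query: string): string {
  const request = {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name: "query_docs", arguments: { query } },
  };
  return JSON.stringify(request) + "\n";
}
```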
query_docs tool: Accepts a plain-text query and returns the answer generated by the Langflow backend.

The API_ENDPOINT environment variable selects the Langflow endpoint to query; if unset, it falls back to a sample local endpoint.

Q: Do I need a Langflow license? A: No, the server works with any locally running Langflow instance; a free community edition is sufficient.
Q: Which LLM models are supported? A: Whatever model you configure inside the Langflow flow (OpenAI, Anthropic, locally hosted models, etc.).
Q: Can I run the server in production? A: Yes, but ensure the Langflow service is secured and API_ENDPOINT points to a protected URL. Use HTTPS and authentication where needed.
Q: How do I debug request failures? A: Run npm run inspector to launch the MCP Inspector, which provides a web UI showing the stdin/stdout exchange and error details.
Q: Is there a Docker image? A: Not provided in the repository, but you can containerize it by installing dependencies and running the built entry point.
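On the server side, handling a query_docs call amounts to matching the tool name, validating the argument, running the query against Langflow, and wrapping the answer in MCP text content. A dependency-free sketch follows; the real server is built on the MCP TypeScript SDK, and `askLangflow` is a hypothetical stand-in for the HTTP call to the Langflow endpoint:

```typescript
// Dependency-free sketch of a tools/call handler. The result shape
// ({ content: [{ type: "text", ... }] }) follows the MCP specification;
// askLangflow is a hypothetical stand-in for the Langflow HTTP request.
interface ToolCallParams {
  name: string;
  arguments?: { query?: string };
}

interface ToolResult {
  content: { type: "text"; text: string }[];
  isError?: boolean;
}

async function handleToolCall(
  params: ToolCallParams,
  askLangflow: (query: string) => Promise<string>,
): Promise<ToolResult> {
  if (params.name !== "query_docs") {
    return { content: [{ type: "text", text: `Unknown tool: ${params.name}` }], isError: true };
  }
  const query = params.arguments?.query;
  if (!query) {
    return { content: [{ type: "text", text: "Missing required argument: query" }], isError: true };
  }
  const answer = await askLangflow(query);
  return { content: [{ type: "text", text: answer }] };
}
```

Returning `isError: true` instead of throwing lets the client surface tool failures as normal tool results, which is the convention MCP uses for tool-level errors.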
A Model Context Protocol server for document Q&A powered by Langflow
This is a TypeScript-based MCP server that implements a document Q&A system. It demonstrates core MCP concepts by providing a simple interface to query documents through a Langflow backend.
- API_ENDPOINT configuration (for example http://127.0.0.1:7860/api/v1/run/<flow-id>?stream=false)
- query_docs - Query the document Q&A system
Install dependencies:
npm install
Build the server:
npm run build
For development with auto-rebuild:
npm run watch
To use with Claude Desktop, add the server config:
On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json
{
  "mcpServers": {
    "langflow-doc-qa-server": {
      "command": "node",
      "args": [
        "/path/to/doc-qa-server/build/index.js"
      ],
      "env": {
        "API_ENDPOINT": "http://127.0.0.1:7860/api/v1/run/480ec7b3-29d2-4caa-b03b-e74118f35fac"
      }
    }
  }
}
To install Document Q&A Server for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install @GongRzhe/Langflow-DOC-QA-SERVER --client claude
The server supports the following environment variables for configuration:
API_ENDPOINT: The endpoint URL for the Langflow API service. Defaults to http://127.0.0.1:7860/api/v1/run/480ec7b3-29d2-4caa-b03b-e74118f35fac if not specified.

Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector, which is available as a package script:
npm run inspector
The Inspector will provide a URL to access debugging tools in your browser.
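The API_ENDPOINT fallback behavior described above can be sketched as a small resolver; this is a plausible illustration mirroring the documented default, not the repository's actual code:

```typescript
// Resolve the Langflow endpoint from the environment, falling back to the
// documented sample default when API_ENDPOINT is unset or blank.
function resolveEndpoint(
  env: Record<string, string | undefined> = (globalThis as any).process?.env ?? {},
): string {
  const fallback = "http://127.0.0.1:7860/api/v1/run/480ec7b3-29d2-4caa-b03b-e74118f35fac";
  const configured = env.API_ENDPOINT?.trim();
  return configured ? configured : fallback;
}
```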
This project is licensed under the MIT License.