by kumo-ai
Build, manage, and query relational graphs from CSV/Parquet, convert natural language to PQL, and obtain predictions, evaluations, and explanations from KumoRFM without any training.
KumoRFM MCP Server enables AI assistants and developer tools to interact with the pre‑trained KumoRFM relational foundation model. It turns tabular data into a heterogeneous graph, accepts Predictive Query Language (PQL) statements—generated automatically from natural language—and returns model predictions, performance metrics, or explanations.
The server exposes tools (find_table_files, inspect_graph_metadata, predict, evaluate, explain, etc.) that can be used within any agentic workflow (CrewAI, LangChain, OpenAI agents, Claude Code SDK, etc.): predict for inference, evaluate for metric calculation, and explain for model interpretability. Install the server with pip and start it with the provided command. Set KUMO_API_KEY in the environment; if omitted, an interactive OAuth2 flow is triggered on the first request.

🔬 MCP server to query KumoRFM in your agentic flows
KumoRFM is a pre-trained Relational Foundation Model (RFM) that generates training-free predictions on any relational multi-table data by interpreting the data as a (temporal) heterogeneous graph. It can be queried via the Predictive Query Language (PQL).
This repository hosts a full-featured MCP (Model Context Protocol) server that empowers AI assistants with KumoRFM intelligence. This server enables AI assistants and developer tools to build, manage, and query relational graphs from CSV/Parquet files, convert natural language into PQL, and obtain predictions, evaluations, and explanations from KumoRFM without any training.
The KumoRFM MCP server is available for Python 3.10 and above. To install, simply run:
pip install kumo-rfm-mcp
Add to your MCP configuration file (e.g., Claude Desktop's mcp_config.json):
{
"mcpServers": {
"kumo-rfm": {
"command": "python",
"args": ["-m", "kumo_rfm_mcp.server"],
"env": {
"KUMO_API_KEY": "your_api_key_here"
}
}
}
}
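Alternatively, if you use Claude Code, you can register the server with a single command:
claude mcp add kumo-rfm python -m kumo_rfm_mcp.server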
We provide a single-click installation via our MCP Bundle (MCPB) (e.g., for integration into Claude Desktop):
Download the latest dxt file from here. The MCP Bundle supports Linux, macOS and Windows, but requires a Python executable to be found in order to create a separate new virtual environment.
See here for the transcript.
https://github.com/user-attachments/assets/56192b0b-d9df-425f-9c10-8517c754420f
You can use the KumoRFM MCP directly in your agentic workflows:
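For example, here is a minimal sketch using the official MCP Python SDK (an assumption; many agent frameworks manage the MCP connection for you). It launches the server over stdio, lists the available tools, and calls find_table_files; the "path" argument name and the ./data directory are illustrative.

# Minimal sketch, assuming the `mcp` Python SDK is installed and
# KUMO_API_KEY is set in your environment.
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(
    command="python",
    args=["-m", "kumo_rfm_mcp.server"],
    # Forward the current environment (including KUMO_API_KEY) to the server.
    env=dict(os.environ),
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])

            # "path" is an assumed argument name; confirm it via the tool schema.
            result = await session.call_tool("find_table_files", {"path": "./data"})
            print(result.content)

asyncio.run(main())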
Browse our examples to get started with agentic workflows powered by KumoRFM.
Available tools:
find_table_files - Searching for tabular files: Find all table-like files (e.g., CSV, Parquet) in a directory.
inspect_table_files - Analyzing table structure: Inspect the first rows of table-like files.
inspect_graph_metadata - Reviewing graph schema: Inspect the current graph metadata.
update_graph_metadata - Updating graph schema: Partially update the current graph metadata.
get_mermaid - Creating graph diagram: Return the graph as a Mermaid entity relationship diagram.
materialize_graph - Assembling graph: Materialize the graph based on the current state of the graph metadata to make it available for inference operations.
lookup_table_rows - Retrieving table entries: Look up rows in the raw data frame of a table for a list of primary keys.
predict - Running predictive query: Execute a predictive query and return model predictions.
evaluate - Evaluating predictive query: Evaluate a predictive query and return performance metrics that compare predictions against known ground-truth labels from historical examples.
explain - Explaining prediction: Execute a predictive query and explain the model prediction.
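As a rough guide to how these tools compose, the sketch below reuses a connected client session like the one in the earlier example. The tool argument names and the PQL query are illustrative assumptions; confirm them against the schemas returned by list_tools and the PQL documentation.

# Illustrative sketch of a typical tool sequence, given an already
# connected `ClientSession`. Argument names and the PQL string are
# assumptions, not the exact tool schemas.
from mcp import ClientSession

async def run_predictive_flow(session: ClientSession) -> None:
    # Review the current graph schema inferred from the table files.
    meta = await session.call_tool("inspect_graph_metadata", {})
    print(meta.content)

    # Materialize the graph so it is available for inference operations.
    await session.call_tool("materialize_graph", {})

    # Hypothetical PQL query: will user 42 place an order in the next 30 days?
    query = "PREDICT COUNT(orders.*, 0, 30, days) > 0 FOR users.user_id = 42"
    prediction = await session.call_tool("predict", {"query": query})
    evaluation = await session.call_tool("evaluate", {"query": query})
    explanation = await session.call_tool("explain", {"query": query})
    print(prediction.content, evaluation.content, explanation.content)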
KUMO_API_KEY: Authentication is needed once before predicting or evaluating with the KumoRFM model. You can generate your KumoRFM API key for free here. If it is not set, you can also authenticate on-the-fly in individual sessions via an OAuth2 flow.

As you work with KumoRFM, if you encounter any problems or things that are confusing or don't work quite right, please open a new issue. You can also submit general feedback and suggestions here. Join our Slack!