by googleapis
An MCP server that streamlines database tool development by handling connection pooling, authentication, observability, and secure access, allowing agents to interact with databases via natural language.
MCP Toolbox for Databases provides a centralized control plane that sits between applications (or AI agents) and databases. It abstracts away low‑level concerns such as connection management, authentication, and tracing, exposing high‑level tools that agents can invoke to query, modify, or manage database resources.
Configuration is done through a tools.yaml file: define sources (e.g., Postgres connection details) and tools (SQL statements or actions), and group them into toolsets. Start the server with ./toolbox --tools-file "tools.yaml" (or the equivalent Docker command); it listens on port 5000 by default. Changes to tools.yaml are picked up automatically without restarting the server.
Q: Is the API stable?
A: The project is currently in beta (pre-1.0). APIs may change until a stable 1.0 release.
Q: Which databases are supported?
A: The toolbox includes source kinds such as postgres. Additional source types may be added in future releases.
Q: How do I disable dynamic reloading?
A: Start the server with the --disable-reload flag.
Q: Can I run the server in Kubernetes?
A: Yes. Pull the container image (us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:<VERSION>) and deploy it as a standard pod, mounting your tools.yaml via a volume.
Q: How do I expose the server securely? A: Use standard TLS termination in front of the container or configure authentication mechanisms provided by your deployment platform.
[!NOTE] MCP Toolbox for Databases is currently in beta, and may see breaking changes until the first stable release (v1.0).
MCP Toolbox for Databases is an open source MCP server for databases. It enables you to develop tools more easily, faster, and more securely by handling complexities such as connection pooling, authentication, and more.
This README provides a brief overview. For comprehensive details, see the full documentation.
[!NOTE] This solution was originally named “Gen AI Toolbox for Databases” as its initial development predated MCP, but was renamed to align with recently added MCP compatibility.
Toolbox helps you build Gen AI tools that let your agents access data in your database. Toolbox provides connection pooling, authentication, observability, and secure access out of the box.
⚡ Supercharge Your Workflow with an AI Database Assistant ⚡
Stop context-switching and let your AI assistant become a true co-developer. By connecting your IDE to your databases with MCP Toolbox, you can delegate complex and time-consuming database tasks, allowing you to build faster and focus on what matters. This isn't just about code completion; it's about giving your AI the context it needs to handle the entire development lifecycle.
Here's how it will save you time: you can ask your assistant to explore schemas, write and run queries, and handle routine database tasks in plain English, directly from your IDE.
Learn how to connect your AI tools (IDEs) to Toolbox using MCP.
Toolbox sits between your application's orchestration framework and your database, providing a control plane that is used to modify, distribute, or invoke tools. It simplifies the management of your tools by providing you with a centralized location to store and update tools, allowing you to share tools between agents and applications and update those tools without necessarily redeploying your application.
For the latest version, check the releases page and use the following instructions for your OS and CPU architecture.
To install Toolbox as a binary:
# see releases page for other versions
export VERSION=0.14.0
curl -O https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox
chmod +x toolbox
# see releases page for other versions
export VERSION=0.14.0
docker pull us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION
To install Toolbox using Homebrew on macOS or Linux:
brew install mcp-toolbox
To install from source, ensure you have the latest version of Go installed, and then run the following command:
go install github.com/googleapis/genai-toolbox@v0.14.0
Configure a tools.yaml
to define your tools, and then
execute toolbox
to start the server:
To run Toolbox from binary:
./toolbox --tools-file "tools.yaml"
ⓘ NOTE:
Toolbox enables dynamic reloading by default. To disable, use the --disable-reload
flag.
To run the server after pulling the container image:
export VERSION=0.14.0 # Use the version you pulled
docker run -p 5000:5000 \
-v $(pwd)/tools.yaml:/app/tools.yaml \
us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION \
--tools-file "/app/tools.yaml"
ⓘ NOTE:
The -v flag mounts your local tools.yaml into the container, and -p maps the container's port 5000 to your host's port 5000.
To run the server directly from source, navigate to the project root directory and run:
go run .
ⓘ NOTE:
This command runs the project from source and is more suitable for development and testing. It does not compile a binary into your $GOPATH. If you want to compile a binary instead, refer to the Developer Documentation.
If you installed Toolbox using Homebrew, the toolbox
binary is available in your system path. You can start the server with the same command:
toolbox --tools-file "tools.yaml"
You can use toolbox help
for a full list of flags! To stop the server, send a
terminate signal (ctrl+c
on most platforms).
For more detailed documentation on deploying to different environments, check out the resources in the How-to section.
Once your server is up and running, you can load the tools into your application. See the list of client SDKs below for various frameworks:
Install Toolbox Core SDK:
pip install toolbox-core
Load tools:
from toolbox_core import ToolboxClient

# update the url to point to your server
async with ToolboxClient("http://127.0.0.1:5000") as client:
    # these tools can be passed to your application!
    tools = await client.load_toolset("toolset_name")
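The loaded tools can be handed to your agent framework or invoked directly. Here is a minimal, self-contained sketch; the tool name search-hotels-by-name, its name parameter, and the sample value are illustrative (taken from the configuration example later in this README), and the call style assumes the async-callable interface of toolbox-core:

```python
import asyncio

from toolbox_core import ToolboxClient


async def main():
    # update the url to point to your server
    async with ToolboxClient("http://127.0.0.1:5000") as client:
        # load a single tool by the name declared in tools.yaml (illustrative name)
        tool = await client.load_tool("search-hotels-by-name")
        # declared parameters map to keyword arguments (assumed call style)
        result = await tool(name="Hilton")
        print(result)


asyncio.run(main())
```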
For more detailed instructions on using the Toolbox Core SDK, see the project's README.
Install Toolbox LangChain SDK:
pip install toolbox-langchain
Load tools:
from toolbox_langchain import ToolboxClient

# update the url to point to your server
async with ToolboxClient("http://127.0.0.1:5000") as client:
    # these tools can be passed to your application!
    tools = client.load_toolset()
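Because the LangChain SDK returns standard LangChain tools, they can be passed straight to an agent. A minimal sketch, assuming the langgraph prebuilt ReAct agent and a Gemini chat model (both choices are illustrative, not part of Toolbox):

```python
import asyncio

from langchain_google_genai import ChatGoogleGenerativeAI
from langgraph.prebuilt import create_react_agent
from toolbox_langchain import ToolboxClient


async def main():
    # update the url to point to your server
    async with ToolboxClient("http://127.0.0.1:5000") as client:
        tools = client.load_toolset()

        # any LangChain chat model works here; Gemini is just an example
        model = ChatGoogleGenerativeAI(model="gemini-2.0-flash")
        agent = create_react_agent(model, tools)

        result = await agent.ainvoke(
            {"messages": [("user", "Find hotels with 'Hilton' in the name.")]}
        )
        print(result["messages"][-1].content)


asyncio.run(main())
```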
For more detailed instructions on using the Toolbox LangChain SDK, see the project's README.
Install Toolbox Llamaindex SDK:
pip install toolbox-llamaindex
Load tools:
from toolbox_llamaindex import ToolboxClient

# update the url to point to your server
async with ToolboxClient("http://127.0.0.1:5000") as client:
    # these tools can be passed to your application!
    tools = client.load_toolset()
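The LlamaIndex SDK returns tools that plug into LlamaIndex agents. A minimal sketch, assuming the classic ReActAgent interface and the Gemini LLM integration (both are assumptions layered on top of the snippet above):

```python
import asyncio

from llama_index.core.agent import ReActAgent
from llama_index.llms.google_genai import GoogleGenAI
from toolbox_llamaindex import ToolboxClient


async def main():
    # update the url to point to your server
    async with ToolboxClient("http://127.0.0.1:5000") as client:
        tools = client.load_toolset()

        # any LlamaIndex LLM works here; Gemini is just an example
        llm = GoogleGenAI(model="gemini-2.0-flash")
        agent = ReActAgent.from_tools(tools, llm=llm, verbose=True)

        response = agent.chat("Find hotels with 'Hilton' in the name.")
        print(response)


asyncio.run(main())
```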
For more detailed instructions on using the Toolbox Llamaindex SDK, see the project's README.
Install Toolbox Core SDK:
npm install @toolbox-sdk/core
Load tools:
import { ToolboxClient } from '@toolbox-sdk/core';
// update the url to point to your server
const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);
// these tools can be passed to your application!
const tools = await client.loadToolset('toolsetName');
For more detailed instructions on using the Toolbox Core SDK, see the project's README.
Install Toolbox Core SDK:
npm install @toolbox-sdk/core
Load tools:
import { ToolboxClient } from '@toolbox-sdk/core';
import { tool } from '@langchain/core/tools';
// update the url to point to your server
const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);
// these tools can be passed to your application!
const toolboxTools = await client.loadToolset('toolsetName');
// Define the basics of the tool: name, description, schema and core logic
const getTool = (toolboxTool) => tool(toolboxTool, {
    name: toolboxTool.getName(),
    description: toolboxTool.getDescription(),
    schema: toolboxTool.getParamSchema()
});

// Use these tools in your LangChain/LangGraph applications
const tools = toolboxTools.map(getTool);
Install Toolbox Core SDK:
npm install @toolbox-sdk/core
Load tools:
import { ToolboxClient } from '@toolbox-sdk/core';
import { genkit } from 'genkit';
import { googleAI } from '@genkit-ai/googleai';
// Initialise genkit
const ai = genkit({
    plugins: [
        googleAI({
            apiKey: process.env.GEMINI_API_KEY || process.env.GOOGLE_API_KEY
        })
    ],
    model: googleAI.model('gemini-2.0-flash'),
});
// update the url to point to your server
const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);
// these tools can be passed to your application!
const toolboxTools = await client.loadToolset('toolsetName');
// Define the basics of the tool: name, description, schema and core logic
const getTool = (toolboxTool) => ai.defineTool({
    name: toolboxTool.getName(),
    description: toolboxTool.getDescription(),
    schema: toolboxTool.getParamSchema()
}, toolboxTool);
// Use these tools in your Genkit applications
const tools = toolboxTools.map(getTool);
Install Toolbox Go SDK:
go get github.com/googleapis/mcp-toolbox-sdk-go
Load tools:
package main
import (
"github.com/googleapis/mcp-toolbox-sdk-go/core"
"context"
)
func main() {
// Make sure to add the error checks
// update the url to point to your server
URL := "http://127.0.0.1:5000";
ctx := context.Background()
client, err := core.NewToolboxClient(URL)
// Framework agnostic tools
tools, err := client.LoadToolset("toolsetName", ctx)
}
For more detailed instructions on using the Toolbox Go SDK, see the project's README.
Install Toolbox Go SDK:
go get github.com/googleapis/mcp-toolbox-sdk-go
Load tools:
package main
import (
"context"
"encoding/json"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
"github.com/tmc/langchaingo/llms"
)
func main() {
// Make sure to add the error checks
// update the url to point to your server
URL := "http://127.0.0.1:5000"
ctx := context.Background()
client, err := core.NewToolboxClient(URL)
// Framework agnostic tool
tool, err := client.LoadTool("toolName", ctx)
// Fetch the tool's input schema
inputschema, err := tool.InputSchema()
var paramsSchema map[string]any
_ = json.Unmarshal(inputschema, &paramsSchema)
// Use this tool with LangChainGo
langChainTool := llms.Tool{
Type: "function",
Function: &llms.FunctionDefinition{
Name: tool.Name(),
Description: tool.Description(),
Parameters: paramsSchema,
},
}
}
Install Toolbox Go SDK:
go get github.com/googleapis/mcp-toolbox-sdk-go
Load tools:
package main
import (
    "context"
    "log"

    "github.com/firebase/genkit/go/genkit"
    "github.com/googleapis/mcp-toolbox-sdk-go/core"
    "github.com/googleapis/mcp-toolbox-sdk-go/tbgenkit"
)
func main() {
// Make sure to add the error checks
// Update the url to point to your server
URL := "http://127.0.0.1:5000"
ctx := context.Background()
g, err := genkit.Init(ctx)
client, err := core.NewToolboxClient(URL)
// Framework agnostic tool
tool, err := client.LoadTool("toolName", ctx)
// Convert the tool using the tbgenkit package
// Use this tool with Genkit Go
genkitTool, err := tbgenkit.ToGenkitTool(tool, g)
if err != nil {
log.Fatalf("Failed to convert tool: %v\n", err)
}
}
Install Toolbox Go SDK:
go get github.com/googleapis/mcp-toolbox-sdk-go
Load tools:
package main
import (
"context"
"encoding/json"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
"google.golang.org/genai"
)
func main() {
// Make sure to add the error checks
// Update the url to point to your server
URL := "http://127.0.0.1:5000"
ctx := context.Background()
client, err := core.NewToolboxClient(URL)
// Framework agnostic tool
tool, err := client.LoadTool("toolName", ctx)
// Fetch the tool's input schema
inputschema, err := tool.InputSchema()
var schema *genai.Schema
_ = json.Unmarshal(inputschema, &schema)
funcDeclaration := &genai.FunctionDeclaration{
Name: tool.Name(),
Description: tool.Description(),
Parameters: schema,
}
// Use this tool with Go GenAI
genAITool := &genai.Tool{
FunctionDeclarations: []*genai.FunctionDeclaration{funcDeclaration},
}
}
Install Toolbox Go SDK:
go get github.com/googleapis/mcp-toolbox-sdk-go
Load tools:
package main
import (
"context"
"encoding/json"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
openai "github.com/openai/openai-go"
)
func main() {
// Make sure to add the error checks
// Update the url to point to your server
URL := "http://127.0.0.1:5000"
ctx := context.Background()
client, err := core.NewToolboxClient(URL)
// Framework agnostic tool
tool, err := client.LoadTool("toolName", ctx)
// Fetch the tool's input schema
inputschema, err := tool.InputSchema()
var paramsSchema openai.FunctionParameters
_ = json.Unmarshal(inputschema, &paramsSchema)
// Use this tool with OpenAI Go
openAITool := openai.ChatCompletionToolParam{
Function: openai.FunctionDefinitionParam{
Name: tool.Name(),
Description: openai.String(tool.Description()),
Parameters: paramsSchema,
},
}
}
The primary way to configure Toolbox is through the tools.yaml file. If you have multiple files, you can tell Toolbox which one to load with the --tools-file tools.yaml flag.
You can find more detailed reference documentation for all resource types in the Resources section.
The sources
section of your tools.yaml
defines what data sources your
Toolbox should have access to. Most tools will have at least one source to
execute against.
sources:
  my-pg-source:
    kind: postgres
    host: 127.0.0.1
    port: 5432
    database: toolbox_db
    user: toolbox_user
    password: my-password
For more details on configuring different types of sources, see the Sources section.
The tools section of a tools.yaml defines the actions an agent can take: what kind of tool it is, which source(s) it affects, what parameters it uses, etc.
tools:
  search-hotels-by-name:
    kind: postgres-sql
    source: my-pg-source
    description: Search for hotels based on name.
    parameters:
      - name: name
        type: string
        description: The name of the hotel.
    statement: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%';
For more details on configuring different types of tools, see the Tools section.
The toolsets
section of your tools.yaml
allows you to define groups of tools
that you want to be able to load together. This can be useful for defining
different groups based on agent or application.
toolsets:
  my_first_toolset:
    - my_first_tool
    - my_second_tool
  my_second_toolset:
    - my_second_tool
    - my_third_tool
You can load toolsets by name:
# This will load all tools
all_tools = client.load_toolset()
# This will only load the tools listed in 'my_second_toolset'
my_second_toolset = client.load_toolset("my_second_toolset")
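If you only need one tool rather than a whole toolset, the core SDK also supports loading a single tool by name; a minimal sketch using the Python core SDK (the tool name is the illustrative one from the tools section above):

```python
from toolbox_core import ToolboxClient

# update the url to point to your server
async with ToolboxClient("http://127.0.0.1:5000") as client:
    # load one tool by the name declared in tools.yaml
    tool = await client.load_tool("search-hotels-by-name")
```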
This project uses semantic versioning (MAJOR.MINOR.PATCH).

Since the project is in a pre-release stage (version 0.x.y), we follow the standard conventions for initial development:

While the major version is 0, the public API should be considered unstable. The version will be incremented as follows:

- 0.MINOR.PATCH: The MINOR version is incremented when we add new functionality or make breaking, incompatible API changes.
- 0.MINOR.PATCH: The PATCH version is incremented for backward-compatible bug fixes.

Once the project reaches a stable 1.0.0 release, the versioning will follow the more common convention:

- MAJOR.MINOR.PATCH: Incremented for incompatible API changes.
- MAJOR.MINOR.PATCH: Incremented for new, backward-compatible functionality.
- MAJOR.MINOR.PATCH: Incremented for backward-compatible bug fixes.

The public API that this applies to is the CLI associated with Toolbox, the interactions with official SDKs, and the definitions in the tools.yaml file.
Contributions are welcome. Please see the CONTRIBUTING guide to get started.
Please note that this project is released with a Contributor Code of Conduct. By participating in this project you agree to abide by its terms. See Contributor Code of Conduct for more information.
Join our Discord community to connect with our developers!
Explore related MCPs that share similar capabilities and solve comparable challenges
by bytebase
Provides a universal gateway that lets MCP‑compatible clients explore and query MySQL, PostgreSQL, SQL Server, MariaDB, and SQLite databases through a single standardized interface.
by designcomputer
Enables secure interaction with MySQL databases via the Model Context Protocol, allowing AI applications to list tables, read contents, and execute queries safely.
by benborla
Provides read‑only access to MySQL databases for large language models, allowing schema inspection and safe execution of SQL queries.
by ClickHouse
Enables AI assistants to run read‑only ClickHouse queries, list databases and tables, and execute embedded chDB queries through an MCP interface.
by chroma-core
Offers an MCP server exposing Chroma's vector database capabilities for LLM applications, supporting collection and document management, multiple embedding functions, and flexible client types such as in‑memory, persistent, HTTP, and cloud.
by kiliczsh
Enables LLMs to interact with MongoDB databases via a standardized interface, offering schema inspection, query execution, aggregation, and write capabilities, with optional read‑only mode and smart ObjectId handling.
by domdomegg
Provides read and write access to Airtable bases for AI systems, enabling inspection of schemas and manipulation of records.
by XGenerationLab
A Model Context Protocol (MCP) server that enables natural language queries to databases.
by apache
Provides an MCP backend service built with Python and FastAPI to interact with Apache Doris databases, enabling natural language to SQL conversion, query execution, metadata extraction, and comprehensive enterprise data governance.