by hashicorp
Provides integration with Terraform Registry APIs, enabling automation and interaction for Infrastructure as Code development via Model Context Protocol.
Terraform MCP Server enables advanced automation for Terraform users by exposing Registry APIs through a Model Context Protocol (MCP) server. It allows AI assistants, IDE extensions, and other tools to query providers, modules, and policies programmatically.
Install the Go binary (`go install github.com/hashicorp/terraform-mcp-server/cmd/terraform-mcp-server@latest`) or build the Docker image (`make docker-build`). Set `TRANSPORT_MODE=streamable-http` (or `http`) for HTTP mode; the server then listens at `http://{host}:8080/mcp`, with a health check at `/health`. Adjust host, port, and CORS via environment variables (`TRANSPORT_HOST`, `TRANSPORT_PORT`, `MCP_ALLOWED_ORIGINS`, etc.).

Q: Which transport should I use?
A: StdIO is ideal for local development and direct MCP clients. Use Streamable-HTTP for remote or distributed setups.
Q: Do I need to set CORS headers?
A: In production, configure `MCP_ALLOWED_ORIGINS` to restrict origins and prevent DNS-rebinding attacks.
Q: How do I switch to stateless mode?
A: Set `MCP_SESSION_MODE=stateless` before starting the server.
Q: Can I run the server without Docker?
A: Yes, install the Go binary and execute `terraform-mcp-server stdio` or `terraform-mcp-server streamable-http`.
Q: What licensing applies?
A: The project is released under the MPL-2.0 open source license.
The Terraform MCP Server is a Model Context Protocol (MCP) server that provides seamless integration with Terraform Registry APIs, enabling advanced automation and interaction capabilities for Infrastructure as Code (IaC) development.
Caution: The outputs and recommendations provided by the MCP server are generated dynamically and may vary based on the query, model, and the connected MCP server. Users should thoroughly review all outputs/recommendations to ensure they align with their organization's security best practices, cost-efficiency goals, and compliance requirements before implementation.
Security Note: When using the StreamableHTTP transport in production, always configure the `MCP_ALLOWED_ORIGINS` environment variable to restrict access to trusted origins only. This helps prevent DNS rebinding attacks and other cross-origin vulnerabilities.
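As a minimal sketch, a hardened StreamableHTTP launch might look like the following; the origin value is purely illustrative and should be replaced with your own trusted client origin:

```sh
# Illustrative hardening: strict CORS with a single hypothetical allowed origin
docker run -p 8080:8080 --rm \
  -e TRANSPORT_MODE=streamable-http \
  -e TRANSPORT_HOST=0.0.0.0 \
  -e MCP_CORS_MODE=strict \
  -e MCP_ALLOWED_ORIGINS="https://ide.example.com" \
  hashicorp/terraform-mcp-server
```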
The Terraform MCP Server supports multiple transport protocols:

Stdio: Standard input/output communication using JSON-RPC messages. Ideal for local development and direct integration with MCP clients.

StreamableHTTP: Modern HTTP-based transport supporting both direct HTTP requests and Server-Sent Events (SSE) streams. This is the recommended transport for remote/distributed setups.
Features:

- MCP endpoint: `http://{hostname}:8080/mcp`
- Health check: `http://{hostname}:8080/health`
- Set `TRANSPORT_MODE=http` or `TRANSPORT_PORT=8080` to enable

Environment Variables:

| Variable | Description | Default |
|---|---|---|
| `TRANSPORT_MODE` | Set to `streamable-http` to enable HTTP transport (legacy `http` value still supported) | `stdio` |
| `TRANSPORT_HOST` | Host to bind the HTTP server | `127.0.0.1` |
| `TRANSPORT_PORT` | HTTP server port | `8080` |
| `MCP_ENDPOINT` | HTTP server endpoint path | `/mcp` |
| `MCP_SESSION_MODE` | Session mode: `stateful` or `stateless` | `stateful` |
| `MCP_ALLOWED_ORIGINS` | Comma-separated list of allowed origins for CORS | `""` (empty) |
| `MCP_CORS_MODE` | CORS mode: `strict`, `development`, or `disabled` | `strict` |
# Stdio mode
terraform-mcp-server stdio [--log-file /path/to/log]
# StreamableHTTP mode
terraform-mcp-server streamable-http [--transport-port 8080] [--transport-host 127.0.0.1] [--mcp-endpoint /mcp] [--log-file /path/to/log]
The Terraform MCP Server supports two session modes when using the StreamableHTTP transport: stateful (the default), which maintains session state across requests, and stateless, which treats each request independently.
To enable stateless mode, set the environment variable:
export MCP_SESSION_MODE=stateless
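Putting this together, a minimal sketch of a stateless StreamableHTTP launch using the flags documented above (the host, port, and endpoint values simply restate the defaults):

```sh
# Stateless mode: each request is handled independently
export MCP_SESSION_MODE=stateless
terraform-mcp-server streamable-http \
  --transport-host 127.0.0.1 \
  --transport-port 8080 \
  --mcp-endpoint /mcp
```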
Add the following JSON block to your User Settings (JSON) file in VS Code. You can do this by pressing `Ctrl + Shift + P` and typing `Preferences: Open User Settings (JSON)`.
More about using MCP server tools in VS Code's agent mode documentation.
{
"mcp": {
"servers": {
"terraform": {
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"hashicorp/terraform-mcp-server"
]
}
}
}
}
Optionally, you can add a similar configuration (i.e., without the `mcp` key) to a file called `.vscode/mcp.json` in your workspace. This will allow you to share the configuration with others.
{
"servers": {
"terraform": {
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"hashicorp/terraform-mcp-server"
]
}
}
}
More about using MCP server tools in the Claude Desktop user documentation. Read more about using the MCP server with Amazon Q in its documentation.
{
"mcpServers": {
"terraform": {
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"hashicorp/terraform-mcp-server"
]
}
}
}
The following sets of tools are available for the public Terraform registry:
| Toolset | Tool | Description |
|---|---|---|
| `providers` | `search_providers` | Queries the Terraform Registry to find and list available documentation for a specific provider using the specified `service_slug`. Returns a list of provider document IDs with their titles and categories for resources, data sources, functions, or guides. |
| `providers` | `get_provider_details` | Fetches the complete documentation content for a specific provider resource, data source, or function using a document ID obtained from the `search_providers` tool. Returns the raw documentation in markdown format. |
| `providers` | `get_latest_provider_version` | Fetches the latest version of a specified provider from the Terraform Registry. |
| `modules` | `search_modules` | Searches the Terraform Registry for modules based on the specified `module_query`, with pagination. Returns a list of module IDs with their names, descriptions, download counts, verification status, and publish dates. |
| `modules` | `get_module_details` | Retrieves detailed documentation for a module using a module ID obtained from the `search_modules` tool, including inputs, outputs, configuration, submodules, and examples. |
| `modules` | `get_latest_module_version` | Fetches the latest version of a specified module from the Terraform Registry. |
| `policies` | `search_policies` | Queries the Terraform Registry to find and list the appropriate Sentinel policies based on the provided `policy_query`. Returns a list of matching policies with their `terraform_policy_id`s, names, titles, and download counts. |
| `policies` | `get_policy_details` | Retrieves detailed documentation for a policy set using a `terraform_policy_id` obtained from the `search_policies` tool, including the policy readme and implementation details. |
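As a rough illustration of how these tools are invoked over the StreamableHTTP transport, the sketch below sends a JSON-RPC `tools/call` request to the local endpoint. It assumes the server is running in stateless mode (or that the MCP initialize handshake has already been completed), and the argument names (`provider_name`, `service_slug`) are assumptions based on the descriptions above; confirm them against the server's `tools/list` output.

```sh
# Hypothetical tools/call request; argument names are assumptions, not the confirmed schema
curl -s http://localhost:8080/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
      "name": "search_providers",
      "arguments": { "provider_name": "aws", "service_slug": "s3_bucket" }
    }
  }'
```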
The following sets of tools are available for HCP Terraform or Terraform Enterprise:
| Toolset | Tool | Description |
|---|---|---|
| `orgs` | `list_organizations` | Lists all Terraform organizations accessible to the authenticated user. |
| `projects` | `list_projects` | Lists all projects within a specified Terraform organization. |
| Resource URI | Description |
|---|---|
| `/terraform/style-guide` | Terraform Style Guide - Provides access to the official Terraform style guide documentation in markdown format |
| `/terraform/module-development` | Terraform Module Development Guide - Comprehensive guide covering module composition, structure, providers, publishing, and refactoring best practices |
| Resource Template URI | Description |
|---|---|
| `/terraform/providers/{namespace}/name/{name}/version/{version}` | Provider Resource Template - Dynamically retrieves detailed documentation and overview for any Terraform provider by namespace, name, and version |
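A similar hedged sketch can fetch one of the static resources listed above via the MCP `resources/read` method, again assuming the StreamableHTTP transport is running locally and any required session handshake has been completed:

```sh
# Reads the style guide resource by the URI listed in the table above
curl -s http://localhost:8080/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{
    "jsonrpc": "2.0",
    "id": 2,
    "method": "resources/read",
    "params": { "uri": "/terraform/style-guide" }
  }'
```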
Use the latest release version:
go install github.com/hashicorp/terraform-mcp-server/cmd/terraform-mcp-server@latest
Use the main branch:
go install github.com/hashicorp/terraform-mcp-server/cmd/terraform-mcp-server@main
{
"mcp": {
"servers": {
"terraform": {
"command": "/path/to/terraform-mcp-server",
"args": ["stdio"]
}
}
}
}
Before using the server, you need to build the Docker image locally:
git clone https://github.com/hashicorp/terraform-mcp-server.git
cd terraform-mcp-server
make docker-build
# Run in stdio mode
docker run -i --rm terraform-mcp-server:dev
# Run in streamable-http mode
docker run -p 8080:8080 --rm -e TRANSPORT_MODE=streamable-http -e TRANSPORT_HOST=0.0.0.0 terraform-mcp-server:dev
Note: When running in Docker, you should set `TRANSPORT_HOST=0.0.0.0` to allow connections from outside the container.
# Test the connection
curl http://localhost:8080/health
{
"mcpServers": {
"terraform": {
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"terraform-mcp-server:dev"
]
}
}
}
| Command | Description |
|---|---|
| `make build` | Build the binary |
| `make test` | Run all tests |
| `make test-e2e` | Run end-to-end tests |
| `make docker-build` | Build Docker image |
| `make run-http` | Run HTTP server locally |
| `make docker-run-http` | Run HTTP server in Docker |
| `make test-http` | Test HTTP health endpoint |
| `make clean` | Remove build artifacts |
| `make help` | Show all available commands |
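As a usage sketch, a typical local development loop built only from the Makefile targets above might look like this:

```sh
# Build, test, and exercise the HTTP transport using the documented make targets
make build          # compile the binary
make test           # run all tests
make docker-build   # build the terraform-mcp-server:dev image
make run-http       # start the HTTP server locally
make test-http      # check the /health endpoint
```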
This project is licensed under the terms of the MPL-2.0 open source license. Please refer to the LICENSE file for the full terms.
For security issues, please contact security@hashicorp.com or follow our security policy.
For bug reports and feature requests, please open an issue on GitHub.
For general questions and discussions, open a GitHub Discussion.
Explore related MCPs that share similar capabilities and solve comparable challenges
by zed-industries
A high‑performance, multiplayer code editor designed for speed and collaboration.
by modelcontextprotocol
Model Context Protocol Servers
by modelcontextprotocol
A Model Context Protocol server for Git repository interaction and automation.
by modelcontextprotocol
A Model Context Protocol server that provides time and timezone conversion capabilities.
by cline
An autonomous coding assistant that can create and edit files, execute terminal commands, and interact with a browser directly from your IDE, operating step‑by‑step with explicit user permission.
by continuedev
Enables faster shipping of code by integrating continuous AI agents across IDEs, terminals, and CI pipelines, offering chat, edit, autocomplete, and customizable agent workflows.
by upstash
Provides up-to-date, version‑specific library documentation and code examples directly inside LLM prompts, eliminating outdated information and hallucinated APIs.
by github
Connects AI tools directly to GitHub, enabling natural‑language interactions for repository browsing, issue and pull‑request management, CI/CD monitoring, code‑security analysis, and team collaboration.
by daytonaio
Provides a secure, elastic infrastructure that creates isolated sandboxes for running AI‑generated code with sub‑90 ms startup, unlimited persistence, and OCI/Docker compatibility.