by aws
Enables AI applications and tools to connect to MCP servers on AWS using AWS IAM (SigV4) authentication, handling request signing automatically and providing both a command‑line proxy and a Python library.
Provides a lightweight bridge that lets AI assistants, developer CLIs, and custom agent frameworks communicate with MCP servers secured with AWS IAM credentials. It abstracts away the complexity of SigV4 request signing, allowing seamless interaction without writing custom authentication code.
Run as a proxy: uvx mcp-proxy-for-aws@latest <endpoint>, or via Docker (docker run --rm -v $HOME/.aws:/app/.aws:ro mcp-proxy-for-aws <endpoint>). Configure your MCP client (e.g., Claude Desktop, Amazon Q CLI) to point to the proxy.
Use as a library: pip install mcp-proxy-for-aws. Import aws_iam_streamablehttp_client and pass it to frameworks such as LangChain, LlamaIndex, Strands Agents, or Microsoft Agent Framework to obtain an authenticated MCP session.
Configuration: --service, --region, --profile, metadata key‑value pairs, read‑only flag, retry count, timeouts, and log level can be set via command‑line arguments or environment variables (AWS_PROFILE, AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_SESSION_TOKEN, AWS_REGION).
Q: Which AWS credentials are used?
A: The proxy reads credentials from the AWS CLI configuration, environment variables, or the instance/profile IAM role. You can explicitly set a profile with --profile.
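As a quick sanity check, the short boto3 sketch below (an illustration, assuming boto3 is installed locally) prints which credentials the standard AWS credential chain resolves; this mirrors the resolution order described above.
# Sketch: verify which AWS credentials the default credential chain picks up.
# Assumes boto3 is installed; pass profile_name to mimic the --profile flag.
import boto3

session = boto3.Session()  # or boto3.Session(profile_name="my-profile")
creds = session.get_credentials()
if creds is None:
    print("No AWS credentials found - configure a profile, env vars, or an IAM role")
else:
    frozen = creds.get_frozen_credentials()
    print(f"Resolved credentials via '{creds.method}' (access key {frozen.access_key[:4]}...)")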
Q: How do I specify the AWS service for SigV4 signing?
A: The --service flag overrides automatic inference. Required when the service name cannot be derived from the endpoint URL.
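To see why inference can fail, here is a rough sketch of URL-based service guessing. It is not the package's actual logic; it only assumes the common pattern that AWS endpoints look like service.region.amazonaws.com.
# Hypothetical sketch of service-name inference from an endpoint URL.
# Not the package's real implementation - when in doubt, pass --service explicitly.
from urllib.parse import urlparse

def guess_service(endpoint: str) -> str | None:
    host = urlparse(endpoint).hostname or ""
    parts = host.split(".")
    # Typical AWS endpoints look like <service>.<region>.amazonaws.com
    if len(parts) >= 3 and parts[-2:] == ["amazonaws", "com"]:
        return parts[0]
    return None  # cannot infer -> --service is required

print(guess_service("https://bedrock-agentcore.us-east-1.amazonaws.com/mcp"))  # bedrock-agentcore
print(guess_service("https://my-custom-domain.example.com/mcp"))               # None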
Q: What does the --read-only flag do?
A: It disables tools that require write permissions, exposing only read‑only capabilities annotated with readOnlyHint=true.
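For context, readOnlyHint is part of a tool's annotations, so a client can apply the same filtering itself. Below is a minimal sketch under the assumption that you already have an initialized ClientSession from the mcp Python SDK.
# Sketch: client-side filtering on readOnlyHint, mirroring what --read-only does.
# Assumes the mcp Python SDK's ClientSession and Tool/ToolAnnotations types.
from mcp import ClientSession
from mcp.types import Tool

async def list_read_only_tools(session: ClientSession) -> list[Tool]:
    result = await session.list_tools()
    return [
        tool for tool in result.tools
        if tool.annotations is not None and tool.annotations.readOnlyHint
    ]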
Q: Can I run the proxy in Docker?
A: Yes. Build the image with docker build -t mcp-proxy-for-aws . and run it, mounting your .aws folder to provide credentials.
Q: How do I integrate with LangChain?
A: Import aws_iam_streamablehttp_client, create a client, and use it to build a ClientSession that LangChain can consume for tool loading.
Q: What timeout values are recommended?
A: Defaults are 180 s overall, 60 s connect, 120 s read, and 180 s write, but they can be tuned via --timeout, --connect-timeout, --read-timeout, and --write-timeout.
Q: How can I troubleshoot authentication errors?
A: Ensure the correct --service is set, verify that valid IAM credentials are available, and check that the endpoint matches the service you are signing for.
The MCP Proxy for AWS package provides two ways to connect AI applications to MCP servers on AWS: as a command-line proxy and as a Python client library.
The Problem: The official MCP specification supports OAuth-based authentication, but MCP servers on AWS can also use AWS IAM authentication (SigV4). Standard MCP clients don't know how to sign requests with AWS credentials.
The Solution: This package bridges that gap by signing MCP requests with SigV4 using your local AWS credentials, so clients and agent frameworks can reach IAM-secured MCP servers without custom authentication code.
Use it as a proxy if you want to connect an existing MCP client (e.g., Claude Desktop, Amazon Q CLI) to an IAM-secured MCP server without writing any code.
Use it as a library if you want to integrate IAM-secured MCP servers into Python agent frameworks such as LangChain, LlamaIndex, Strands Agents, or Microsoft Agent Framework.
Prerequisite: the uv package manager.
The MCP Proxy serves as a lightweight, client-side bridge between MCP clients (AI assistants and developer tools) and IAM-secured MCP servers on AWS. The proxy handles SigV4 authentication using local AWS credentials and provides dynamic tool discovery.
# Run the server
uvx mcp-proxy-for-aws@latest <SigV4 MCP endpoint URL>
Note: The first run may take tens of seconds as uvx downloads and caches dependencies. Subsequent runs will start in seconds. Actual startup time depends on your network and hardware.
git clone https://github.com/aws/mcp-proxy-for-aws.git
cd mcp-proxy-for-aws
uv run mcp_proxy_for_aws/server.py <SigV4 MCP endpoint URL>
# Build the Docker image
docker build -t mcp-proxy-for-aws .
| Parameter | Description | Default | Required |
|---|---|---|---|
| endpoint | MCP endpoint URL (e.g., https://your-service.us-east-1.amazonaws.com/mcp) | N/A | Yes |
| --service | AWS service name for SigV4 signing; if omitted, the proxy tries to infer it from the endpoint URL | Inferred from endpoint if not provided | No |
| --profile | AWS profile to use for credentials | Uses the AWS_PROFILE environment variable if not set | No |
| --region | AWS region to use | Uses the AWS_REGION environment variable if not set; defaults to us-east-1 | No |
| --metadata | Metadata to inject into MCP requests as key=value pairs (e.g., --metadata KEY1=value1 KEY2=value2) | AWS_REGION is automatically injected based on --region if not provided | No |
| --read-only | Disable tools that may require write permissions (tools that do NOT require write permissions are annotated with readOnlyHint=true) | False | No |
| --retries | Number of retries when calling upstream services; 0 disables retries | 0 | No |
| --log-level | Set the logging level (DEBUG/INFO/WARNING/ERROR/CRITICAL) | INFO | No |
| --timeout | Overall timeout in seconds across all operations | 180 | No |
| --connect-timeout | Connect timeout in seconds | 60 | No |
| --read-timeout | Read timeout in seconds | 120 | No |
| --write-timeout | Write timeout in seconds | 180 | No |
Set the environment variables for the MCP Proxy for AWS:
# Credentials through profile
export AWS_PROFILE=<aws_profile>
# Credentials through parameters
export AWS_ACCESS_KEY_ID=<access_key_id>
export AWS_SECRET_ACCESS_KEY=<secret_access_key>
export AWS_SESSION_TOKEN=<session_token>
# AWS Region
export AWS_REGION=<aws_region>
Add the following configuration to your MCP client config file (e.g., for Amazon Q Developer CLI, edit ~/.aws/amazonq/mcp.json):
Note: Replace <SigV4 MCP endpoint URL> with your own endpoint.
{
  "mcpServers": {
    "<mcp server name>": {
      "disabled": false,
      "type": "stdio",
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/mcp_proxy_for_aws",
        "run",
        "server.py",
        "<SigV4 MCP endpoint URL>",
        "--service",
        "<your service code>",
        "--profile",
        "default",
        "--region",
        "us-east-1",
        "--read-only",
        "--log-level",
        "INFO"
      ]
    }
  }
}
[!NOTE] Cline users should not use the --log-level argument, because Cline checks log messages in stderr for the text "error" (case insensitive).
{
  "mcpServers": {
    "<mcp server name>": {
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "--volume",
        "/full/path/to/.aws:/app/.aws:ro",
        "mcp-proxy-for-aws",
        "<SigV4 MCP endpoint URL>"
      ],
      "env": {}
    }
  }
}
The MCP Proxy for AWS enables programmatic integration of IAM-secured MCP servers into AI agent frameworks. The library provides authenticated transport layers that work with popular Python AI frameworks.
The library supports two integration patterns depending on your framework:
Use with: Frameworks that accept a factory function that returns an MCP client, e.g. Strands Agents, Microsoft Agent Framework. The aws_iam_streamablehttp_client is passed as a factory to the framework, which handles the connection lifecycle internally.
Example - Strands Agents:
# Framework imports (strands-agents package)
from strands import Agent
from strands.tools.mcp import MCPClient

from mcp_proxy_for_aws.client import aws_iam_streamablehttp_client

# Factory returning an IAM-authenticated streamable HTTP transport
mcp_client_factory = lambda: aws_iam_streamablehttp_client(
    endpoint=mcp_url,    # The URL of the MCP server
    aws_region=region,   # The region of the MCP server
    aws_service=service  # The underlying AWS service, e.g. "bedrock-agentcore"
)

with MCPClient(mcp_client_factory) as mcp_client:
    mcp_tools = mcp_client.list_tools_sync()
    agent = Agent(tools=mcp_tools, ...)
Example - Microsoft Agent Framework:
# Framework imports (paths assume the agent-framework package; they may differ between versions)
from agent_framework import ChatAgent, MCPStreamableHTTPTool

from mcp_proxy_for_aws.client import aws_iam_streamablehttp_client

# Factory returning an IAM-authenticated streamable HTTP transport
mcp_client_factory = lambda: aws_iam_streamablehttp_client(
    endpoint=mcp_url,    # The URL of the MCP server
    aws_region=region,   # The region of the MCP server
    aws_service=service  # The underlying AWS service, e.g. "bedrock-agentcore"
)

mcp_tools = MCPStreamableHTTPTool(name="MCP Tools", url=mcp_url)
# Swap in the SigV4-signing client factory so requests are IAM-authenticated
mcp_tools.get_mcp_client = mcp_client_factory

async with mcp_tools:
    agent = ChatAgent(tools=[mcp_tools], ...)
Use with: Frameworks that require direct access to the MCP sessions, e.g. LangChain, LlamaIndex. The aws_iam_streamablehttp_client provides the authenticated transport streams, which are then used to create an MCP ClientSession.
Example - LangChain:
from langchain_mcp_adapters.tools import load_mcp_tools
from mcp import ClientSession

from mcp_proxy_for_aws.client import aws_iam_streamablehttp_client

# IAM-authenticated streamable HTTP transport
mcp_client = aws_iam_streamablehttp_client(
    endpoint=mcp_url,    # The URL of the MCP server
    aws_region=region,   # The region of the MCP server
    aws_service=service  # The underlying AWS service, e.g. "bedrock-agentcore"
)

async with mcp_client as (read, write, session_id_callback):
    async with ClientSession(read, write) as session:
        mcp_tools = await load_mcp_tools(session)
        agent = create_langchain_agent(tools=mcp_tools, ...)  # your agent constructor
Example - LlamaIndex:
from llama_index.core.agent import ReActAgent  # import path may vary by llama-index version
from llama_index.tools.mcp import McpToolSpec
from mcp import ClientSession

from mcp_proxy_for_aws.client import aws_iam_streamablehttp_client

# IAM-authenticated streamable HTTP transport
mcp_client = aws_iam_streamablehttp_client(
    endpoint=mcp_url,    # The URL of the MCP server
    aws_region=region,   # The region of the MCP server
    aws_service=service  # The underlying AWS service, e.g. "bedrock-agentcore"
)

async with mcp_client as (read, write, session_id_callback):
    async with ClientSession(read, write) as session:
        mcp_tools = await McpToolSpec(client=session).to_tool_list_async()
        agent = ReActAgent(tools=mcp_tools, ...)
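To run the direct-session pattern as a standalone script, wrap it in an async entry point. Below is a minimal sketch (the endpoint, region, and service values are placeholders); it only lists the server's tools via the mcp SDK's ClientSession.
# Minimal end-to-end sketch of the direct-session pattern.
# The endpoint, region, and service values below are placeholders.
import asyncio

from mcp import ClientSession
from mcp_proxy_for_aws.client import aws_iam_streamablehttp_client

async def main() -> None:
    async with aws_iam_streamablehttp_client(
        endpoint="https://your-service.us-east-1.amazonaws.com/mcp",
        aws_region="us-east-1",
        aws_service="your-service",
    ) as (read, write, _session_id_callback):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

if __name__ == "__main__":
    asyncio.run(main())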
Explore complete working examples for different frameworks in the ./examples/mcp-client directory:
Available examples cover the frameworks shown above: Strands Agents, Microsoft Agent Framework, LangChain, and LlamaIndex.
Run examples individually:
cd examples/mcp-client/[framework] # e.g. examples/mcp-client/strands
uv run main.py
The client library is included when you install the package:
pip install mcp-proxy-for-aws
For development:
git clone https://github.com/aws/mcp-proxy-for-aws.git
cd mcp-proxy-for-aws
uv sync
Authentication error - Invalid credentials
The proxy tries to autodetect the service from the endpoint URL, but this can fail; ensure that --service is set correctly to the service you are attempting to connect to. Otherwise the SigV4 signature cannot be verified by the service you connect to, resulting in this error. Also ensure that valid IAM credentials are available on your machine before retrying.
For development setup, testing, and contribution guidelines, see the contributing documentation in the GitHub repository.
To learn more about SigV4, see the AWS documentation on the Signature Version 4 signing process.
Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. Licensed under the Apache License, Version 2.0 (the "License").
LLMs are non-deterministic and they make mistakes; we advise you to always thoroughly test and follow your organization's best practices before using these tools on customer-facing accounts. Users of this package are solely responsible for implementing proper security controls and MUST use AWS Identity and Access Management (IAM) to manage access to AWS resources. You are responsible for configuring appropriate IAM policies, roles, and permissions, and any security vulnerabilities resulting from improper IAM configuration are your sole responsibility. By using this package, you acknowledge that you have read and understood this disclaimer and agree to use the package at your own risk.
Explore related MCPs that share similar capabilities and solve comparable challenges
by modelcontextprotocol
A Model Context Protocol server for Git repository interaction and automation.
by zed-industries
A high‑performance, multiplayer code editor designed for speed and collaboration.
by modelcontextprotocol
Model Context Protocol Servers
by modelcontextprotocol
A Model Context Protocol server that provides time and timezone conversion capabilities.
by cline
An autonomous coding assistant that can create and edit files, execute terminal commands, and interact with a browser directly from your IDE, operating step‑by‑step with explicit user permission.
by upstash
Provides up-to-date, version‑specific library documentation and code examples directly inside LLM prompts, eliminating outdated information and hallucinated APIs.
by daytonaio
Provides a secure, elastic infrastructure that creates isolated sandboxes for running AI‑generated code with sub‑90 ms startup, unlimited persistence, and OCI/Docker compatibility.
by continuedev
Enables faster shipping of code by integrating continuous AI agents across IDEs, terminals, and CI pipelines, offering chat, edit, autocomplete, and customizable agent workflows.
by github
Connects AI tools directly to GitHub, enabling natural‑language interactions for repository browsing, issue and pull‑request management, CI/CD monitoring, code‑security analysis, and team collaboration.