by baryhuang
Run generated Python code to query and manage AWS resources through boto3, with sandboxed execution and containerization for safe interaction.
AWS Resources MCP Server provides a Model Context Protocol (MCP) endpoint that executes Python snippets using boto3. The service lets Claude (or other MCP‑compatible clients) directly query or modify any AWS service permitted by the supplied credentials, all inside a sandboxed Docker container.
To use it:

1. Set AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, optionally AWS_SESSION_TOKEN, and AWS_DEFAULT_REGION (or use an AWS_PROFILE).
2. Pull the Docker image buryhuang/mcp-server-aws-resources:latest and start it, passing the credential environment variables or mounting ~/.aws for a profile.
3. Call the aws_resources_query_or_modify tool with a code_snippet string that contains boto3 calls and assigns the final output to a variable named result.
4. The server serializes result (including dates and AWS-specific objects) back to the client.

A typical client configuration:

{
"mcpServers": {
"aws-resources": {
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"-e",
"AWS_ACCESS_KEY_ID=YOUR_KEY",
"-e",
"AWS_SECRET_ACCESS_KEY=YOUR_SECRET",
"-e",
"AWS_DEFAULT_REGION=us-east-1",
"buryhuang/mcp-server-aws-resources:latest"
]
}
}
}
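For example, a minimal code_snippet passed to the tool might look like this (as in the examples further below, a boto3 session object is assumed to be available in the execution environment):

# List S3 buckets and return just their names
s3 = session.client('s3')
buckets = s3.list_buckets()
# The server serializes whatever is assigned to `result`
result = [b['Name'] for b in buckets['Buckets']]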
Q: Is the server read‑only?
A: No. It runs any boto3 code, so operations depend on the IAM permissions of the supplied credentials.
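For instance, a snippet like the following attempts a mutating call; it succeeds or fails purely on the caller's IAM permissions (the instance ID is a placeholder):

# A mutating operation; requires ec2:StopInstances permission
ec2 = session.client('ec2')
result = ec2.stop_instances(InstanceIds=['i-0123456789abcdef0'])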
Q: How does the sandbox protect me?
A: The code is parsed with AST to allow only a whitelist of imports (boto3, operator, json, datetime, pytz, dateutil, re, time) and a limited set of built-ins. The execution environment is isolated inside Docker.
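A minimal sketch of how AST-based import whitelisting can work, for illustration only (this is not necessarily the server's exact implementation):

import ast

ALLOWED_IMPORTS = {'boto3', 'operator', 'json', 'datetime', 'pytz', 'dateutil', 're', 'time'}

def validate_imports(code_snippet: str) -> None:
    """Raise if the snippet imports anything outside the whitelist."""
    for node in ast.walk(ast.parse(code_snippet)):
        if isinstance(node, ast.Import):
            names = [alias.name.split('.')[0] for alias in node.names]
        elif isinstance(node, ast.ImportFrom):
            names = [(node.module or '').split('.')[0]]
        else:
            continue
        for name in names:
            if name not in ALLOWED_IMPORTS:
                raise ValueError(f"Import of '{name}' is not allowed")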
Q: Can I use an AWS profile instead of raw keys?
A: Yes. Set AWS_PROFILE and mount ~/.aws into the container (-v ~/.aws:/root/.aws).
Q: What languages can I write the snippet in?
A: Python only, using the boto3 SDK.
Q: How do I install without Docker?
A: The repository can be run via uv run src/mcp_server_aws_resources/server.py after cloning, but Docker is the recommended, zero-setup method.
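Assuming the repository URL matches the package name (an assumption; adjust to the actual repo), the flow looks like:

git clone https://github.com/baryhuang/mcp-server-aws-resources-python.git
cd mcp-server-aws-resources-python
uv run src/mcp_server_aws_resources/server.py --profile your-profile-name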
A Model Context Protocol (MCP) server implementation that runs generated Python code to query any AWS resource through boto3.
Use at your own risk: I didn't limit the operations to read-only, so this tool can also help careful Ops people perform management operations. Your AWS user role dictates the permissions for what you can do.
Demo: Fix DynamoDB Permission Error
https://github.com/user-attachments/assets/de88688d-d7a0-45e1-94eb-3f5d71e9a7c7
I tried AWS Chatbot with Developer Access. The Free Tier has a limit of 25 resource queries per month, and the next tier is $19/month while including mostly features I don't use. On top of that, the results come back as raw JSON with a lot of restrictions.
I tried using aws-mcp but ran into a few issues, so I created this new approach.
For more information about the Model Context Protocol and how it works, see Anthropic's MCP documentation.
The server exposes the following resource:

aws://query_resources: A dynamic resource that provides access to AWS resources through boto3 queries

Here are some example queries you can execute:
# List all S3 buckets (a boto3 `session` is available in the execution environment)
s3 = session.client('s3')
result = s3.list_buckets()
# Get the most recent successful deployment of a CodePipeline pipeline
from operator import itemgetter  # `operator` is on the import whitelist

def get_latest_deployment(pipeline_name):
    codepipeline = session.client('codepipeline')
    result = codepipeline.list_pipeline_executions(
        pipelineName=pipeline_name,
        maxResults=5
    )
    if result['pipelineExecutionSummaries']:
        # Pick the newest execution that finished successfully
        latest_execution = max(
            [e for e in result['pipelineExecutionSummaries']
             if e['status'] == 'Succeeded'],
            key=itemgetter('startTime'),
            default=None
        )
        if latest_execution:
            result = codepipeline.get_pipeline_execution(
                pipelineName=pipeline_name,
                pipelineExecutionId=latest_execution['pipelineExecutionId']
            )
        else:
            result = None
    else:
        result = None
    return result

result = get_latest_deployment("your-pipeline-name")
Note: All code snippets must set a result variable that will be returned to the client. The result variable will be automatically converted to JSON format, with proper handling of AWS-specific objects and datetime values.
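For example, a query like the following returns LaunchTime datetime objects, which the server converts to JSON-safe values:

# List running EC2 instances; LaunchTime is a datetime the server will serialize
ec2 = session.client('ec2')
response = ec2.describe_instances(
    Filters=[{'Name': 'instance-state-name', 'Values': ['running']}]
)
result = [
    {'InstanceId': i['InstanceId'], 'LaunchTime': i['LaunchTime']}
    for r in response['Reservations']
    for i in r['Instances']
]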
The server offers a tool for executing AWS queries:

aws_resources_query_or_modify

Input: code_snippet (string): Python code using boto3 to query AWS resources. The code must set a result variable with the query output.

The server includes several safety features:

AST-based validation that restricts imports to the whitelist noted above (boto3, operator, json, datetime, pytz, dateutil, re, time)
A limited set of available built-ins
Isolated execution inside a Docker container
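As an illustration, an MCP client invokes the tool with arguments shaped like this (the snippet string is just an example):

{
  "code_snippet": "s3 = session.client('s3')\nresult = s3.list_buckets()"
}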
You'll need AWS credentials with appropriate permissions to query AWS resources. You can obtain these by creating an IAM user and generating access keys for it, or by using temporary credentials from an assumed role.
The following environment variables are required:

AWS_ACCESS_KEY_ID: Your AWS access key
AWS_SECRET_ACCESS_KEY: Your AWS secret key
AWS_SESSION_TOKEN: (Optional) AWS session token if using temporary credentials
AWS_DEFAULT_REGION: AWS region (defaults to 'us-east-1' if not set)

You can also use a profile stored in the ~/.aws/credentials file. To do this, set the AWS_PROFILE environment variable to the profile name.
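A typical ~/.aws/credentials entry looks like this (profile name and values are placeholders):

[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY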
Note: Keep your AWS credentials secure and never commit them to version control.
To install AWS Resources MCP Server for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install mcp-server-aws-resources-python --client claude
You can either build the image locally or pull it from Docker Hub. The image is built for the Linux platform.

Pull from Docker Hub:

docker pull buryhuang/mcp-server-aws-resources:latest

Or build locally:

docker build -t mcp-server-aws-resources .
Run the container:
docker run \
-e AWS_ACCESS_KEY_ID=your_access_key_id_here \
-e AWS_SECRET_ACCESS_KEY=your_secret_access_key_here \
-e AWS_DEFAULT_REGION=your_region_here \
buryhuang/mcp-server-aws-resources:latest
Or using stored credentials and a profile:
docker run \
-e AWS_PROFILE=[AWS_PROFILE_NAME] \
-v ~/.aws:/root/.aws \
buryhuang/mcp-server-aws-resources:latest
To publish the Docker image for multiple platforms, you can use the docker buildx command. Follow these steps:
Create a new builder instance (if you haven't already):
docker buildx create --use
Build and push the image for multiple platforms:
docker buildx build --platform linux/amd64,linux/arm64,linux/arm/v7 -t buryhuang/mcp-server-aws-resources:latest --push .
Verify the image is available for the specified platforms:
docker buildx imagetools inspect buryhuang/mcp-server-aws-resources:latest
To use the server with Claude Desktop, add one of the following to your claude_desktop_config.json. With explicit credentials:

{
"mcpServers": {
"aws-resources": {
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"-e",
"AWS_ACCESS_KEY_ID=your_access_key_id_here",
"-e",
"AWS_SECRET_ACCESS_KEY=your_secret_access_key_here",
"-e",
"AWS_DEFAULT_REGION=us-east-1",
"buryhuang/mcp-server-aws-resources:latest"
]
}
}
}
Or with an AWS profile and ~/.aws mounted into the container:

{
"mcpServers": {
"aws-resources": {
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"-e",
"AWS_PROFILE=default",
"-v",
"~/.aws:/root/.aws",
"buryhuang/mcp-server-aws-resources:latest"
]
}
}
}
Or running from a local clone with uv (the --profile argument selects an AWS profile):

{
"mcpServers": {
"aws": {
"command": "/Users/gmr/.local/bin/uv",
"args": [
"--directory",
"/<your-path>/mcp-server-aws-resources-python",
"run",
"src/mcp_server_aws_resources/server.py",
"--profile",
"testing"
]
}
}
}
Explore related MCPs that share similar capabilities and solve comparable challenges
by awslabs
Provides specialized servers that expose AWS capabilities through the Model Context Protocol, enabling AI assistants to retrieve up-to-date documentation, execute API calls, and automate infrastructure workflows directly within development environments.
by cloudflare
Provides a collection of Model Context Protocol servers that enable MCP‑compatible clients to interact with Cloudflare services such as Workers, Observability, Radar, and more, allowing natural‑language driven management of configurations, data, and operations.
by Flux159
Connects to a Kubernetes cluster and offers a unified MCP interface for kubectl, Helm, port‑forwarding, diagnostics, and non‑destructive read‑only mode.
by TencentEdgeOne
Deploy HTML, folders, or zip archives to EdgeOne Pages and instantly obtain a public URL for fast edge delivery.
by rishikavikondala
Provides Model Context Protocol tools for performing AWS S3 and DynamoDB operations, with automatic logging and audit access via the `audit://aws-operations` endpoint.
by confluentinc
Enables AI assistants to manage Confluent Cloud resources such as Kafka topics, connectors, and Flink SQL statements through natural‑language interactions.
by aliyun
Enables AI assistants to operate Alibaba Cloud resources such as ECS, Cloud Monitor, OOS and other services through seamless integration with Alibaba Cloud APIs via the Model Context Protocol.
by aws-samples
Retrieve PDF documents and other S3 objects through Model Context Protocol resources, enabling LLMs to pull data directly from AWS S3 buckets.
by kocierik
Connects to HashiCorp Nomad and exposes Model Context Protocol endpoints for job, deployment, node, allocation, variable, volume, ACL, Sentinel, and cluster management via a Go‑based server.