by RamXX
Enables AI-driven web and news searches with Tavily's API, offering tools for comprehensive queries, answer generation, and domain filtering for language models.
Mcp Tavily provides Model Context Protocol (MCP) tools that let large language models perform AI‑powered web searches, generate direct answers with supporting evidence, and retrieve recent news articles. It wraps Tavily's search API and exposes three distinct tools—web search, answer‑search, and news search—each configurable with result limits, depth, and domain inclusion/exclusion.
pip install mcp-tavily # or: uv add mcp-tavily
Provide your Tavily API key via a .env file, an environment variable, or a command-line argument:
export TAVILY_API_KEY=your_key_here
python -m mcp_server_tavily # runs on the default port 8000
The server can also be launched from VS Code using the supplied mcp configuration or via Docker (make docker-build && make docker-run).

Each search tool accepts parameters such as max_results, search_depth, include_domains, and exclude_domains for fine-grained control.

Q: How do I supply my API key?
A: Place TAVILY_API_KEY=your_key in a .env file, export it as an environment variable, or pass --api-key=your_key when launching the server.
Q: Which Python version is required? A: Python 3.11 or later.
Q: Can I run Mcp Tavily in a container?
A: Yes. Use make docker-build to build the image and make docker-run to start it, optionally overriding TAVILY_API_KEY.
Q: What limits exist for search results? A: Up to 20 results per request; defaults to 5.
Q: Is there support for advanced search depth?
A: search_depth can be set to "basic" or "advanced" (the default varies per tool).
Q: How do I integrate with VS Code?
A: Add the provided JSON snippet under the mcp setting in settings.json, or place it in .vscode/mcp.json.
A Model Context Protocol server that provides AI-powered web search capabilities using Tavily's search API. This server enables LLMs to perform sophisticated web searches, get direct answers to questions, and search recent news articles with AI-extracted relevant content.
tavily_web_search - Performs comprehensive web searches with AI-powered content extraction.
- query (string, required): Search query
- max_results (integer, optional): Maximum number of results to return (default: 5, max: 20)
- search_depth (string, optional): Either "basic" or "advanced" search depth (default: "basic")
- include_domains (list or string, optional): List of domains to specifically include in results
- exclude_domains (list or string, optional): List of domains to exclude from results

tavily_answer_search - Performs web searches and generates direct answers with supporting evidence.
- query (string, required): Search query
- max_results (integer, optional): Maximum number of results to return (default: 5, max: 20)
- search_depth (string, optional): Either "basic" or "advanced" search depth (default: "advanced")
- include_domains (list or string, optional): List of domains to specifically include in results
- exclude_domains (list or string, optional): List of domains to exclude from results

tavily_news_search - Searches recent news articles with publication dates.
- query (string, required): Search query
- max_results (integer, optional): Maximum number of results to return (default: 5, max: 20)
- days (integer, optional): Number of days back to search (default: 3)
- include_domains (list or string, optional): List of domains to specifically include in results
- exclude_domains (list or string, optional): List of domains to exclude from results

The server also provides prompt templates for each search type.
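To illustrate how these documented parameters map onto the tavily-python client that mcp-tavily wraps, here is a small sketch. The helper function build_search_kwargs is hypothetical (it is not part of mcp-tavily); it simply encodes the documented defaults and limits before handing keyword arguments to TavilyClient.search:

```python
# Hypothetical helper showing how the documented tool parameters translate
# into keyword arguments for tavily-python's TavilyClient.search.
def build_search_kwargs(query, max_results=5, search_depth="basic",
                        include_domains=None, exclude_domains=None):
    if not 1 <= max_results <= 20:  # documented default: 5, cap: 20
        raise ValueError("max_results must be between 1 and 20")
    if search_depth not in ("basic", "advanced"):
        raise ValueError('search_depth must be "basic" or "advanced"')
    kwargs = {"query": query, "max_results": max_results,
              "search_depth": search_depth}
    if include_domains:
        kwargs["include_domains"] = list(include_domains)
    if exclude_domains:
        kwargs["exclude_domains"] = list(exclude_domains)
    return kwargs

# With an API key available, the result could be passed straight through:
#   from tavily import TavilyClient
#   client = TavilyClient(api_key=os.environ["TAVILY_API_KEY"])
#   client.search(**build_search_kwargs("redwood trees",
#                                       exclude_domains=["wikipedia.org"]))
```

The exact call the MCP server makes internally is an implementation detail; treat this as an illustration of the parameter surface, not the project's source.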
Prerequisites: the uv Python package manager (recommended).

# With pip
pip install mcp-tavily
# Or with uv (recommended)
uv add mcp-tavily
You should see output similar to:
Resolved packages: mcp-tavily, mcp, pydantic, python-dotenv, tavily-python [...]
Successfully installed mcp-tavily-0.1.4 mcp-1.0.0 [...]
# Clone the repository
git clone https://github.com/RamXX/mcp-tavily.git
cd mcp-tavily
# Create a virtual environment (optional but recommended)
python -m venv .venv
source .venv/bin/activate # On Windows: .venv\Scripts\activate
# Install dependencies and build
uv sync # Or: pip install -r requirements.txt
uv build # Or: pip install -e .
# To install with test dependencies:
uv sync --dev # Or: pip install -r requirements-dev.txt
During installation, you should see the package being built and installed with its dependencies.
For quick installation, use one of the one-click install buttons below:
For manual installation, add the following JSON block to your User Settings (JSON) file in VS Code. You can do this by pressing Ctrl + Shift + P and typing Preferences: Open User Settings (JSON).
Optionally, you can add it to a file called .vscode/mcp.json
in your workspace. This will allow you to share the configuration with others.
Note that the mcp key is not needed in the .vscode/mcp.json file.
{
"mcp": {
"inputs": [
{
"type": "promptString",
"id": "apiKey",
"description": "Tavily API Key",
"password": true
}
],
"servers": {
"tavily": {
"command": "uvx",
"args": ["mcp-tavily"],
"env": {
"TAVILY_API_KEY": "${input:apiKey}"
}
}
}
}
}
The server requires a Tavily API key, which can be provided in three ways:
Through a .env file in your project directory:
TAVILY_API_KEY=your_api_key_here
As an environment variable:
export TAVILY_API_KEY=your_api_key_here
As a command-line argument:
python -m mcp_server_tavily --api-key=your_api_key_here
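A minimal sketch of how a server can resolve the key from these three sources. The precedence shown here (command-line flag, then environment, then .env file via python-dotenv, which is a listed dependency) is an assumption for illustration, not a statement about mcp-tavily's internals:

```python
import argparse
import os

def resolve_api_key(argv=None):
    # Assumption: CLI flag wins over environment, which wins over .env.
    try:
        from dotenv import load_dotenv  # python-dotenv, a listed dependency
        load_dotenv()  # loads TAVILY_API_KEY from a .env file if present
    except ImportError:
        pass
    parser = argparse.ArgumentParser()
    parser.add_argument("--api-key")
    args, _ = parser.parse_known_args(argv)
    key = args.api_key or os.environ.get("TAVILY_API_KEY")
    if not key:
        raise SystemExit("TAVILY_API_KEY is required")
    return key
```

Note that load_dotenv does not override variables already set in the environment, so an exported TAVILY_API_KEY takes precedence over the .env file in this sketch.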
Add to your Claude settings:
"mcpServers": {
  "tavily": {
    "command": "python",
    "args": ["-m", "mcp_server_tavily"],
    "env": {
      "TAVILY_API_KEY": "your_api_key_here"
    }
  }
}
If you encounter issues, you may need to specify the full path to your Python interpreter. Run which python to find the exact path.
For a regular web search:
Tell me about Anthropic's newly released MCP protocol
To generate a report with domain filtering:
Tell me about redwood trees. Please use MLA format in markdown syntax and include the URLs in the citations. Exclude Wikipedia sources.
To use answer search mode for direct answers:
I want a concrete answer backed by current web sources: What is the average lifespan of redwood trees?
For news search:
Give me the top 10 AI-related news in the last 5 days
The project includes a comprehensive test suite with automated dependency compatibility testing.
Install test dependencies:
source .venv/bin/activate # If using a virtual environment
uv sync --dev # Or: pip install -r requirements-dev.txt
Run the standard test suite:
./tests/run_tests.sh
# Or using Make
make test
To ensure the project works with the latest dependency versions, use these commands:
# Test with latest dependencies using Make
make test-deps
# Full compatibility test with verbose output
make test-compatibility
# Or use the standalone script
./scripts/test-compatibility.sh
These commands install the latest versions of all dependencies and run the test suite against them.
The project includes automated dependency compatibility testing through GitHub Actions:
When tests pass: Your project is compatible with the latest dependency versions. You can safely update your requirements files.
When tests fail: Review the test output to identify breaking changes, update your code to handle API changes, update tests if needed, or consider pinning problematic dependency versions.
You should see output similar to:
======================================================= test session starts ========================================================
platform darwin -- Python 3.13.3, pytest-8.3.5, pluggy-1.5.0
rootdir: /Users/ramirosalas/workspace/mcp-tavily
configfile: pyproject.toml
plugins: cov-6.0.0, asyncio-0.25.3, anyio-4.8.0, mock-3.14.0
asyncio: mode=Mode.STRICT, asyncio_default_fixture_loop_scope=function
collected 50 items
tests/test_docker.py .. [ 4%]
tests/test_integration.py ..... [ 14%]
tests/test_models.py ................. [ 48%]
tests/test_server_api.py ..................... [ 90%]
tests/test_utils.py ..... [100%]
---------- coverage: platform darwin, python 3.13.3-final-0 ----------
Name Stmts Miss Cover
-------------------------------------------------------
src/mcp_server_tavily/__init__.py 16 2 88%
src/mcp_server_tavily/__main__.py 2 2 0%
src/mcp_server_tavily/server.py 149 16 89%
-------------------------------------------------------
TOTAL 167 20 88%
The test suite includes tests for data models, utility functions, integration testing, error handling, and parameter validation. It focuses on verifying that all API capabilities work correctly, including handling of domain filters and various input formats.
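To give a flavor of what parameter-validation tests can look like, here is a self-contained sketch. The SearchRequest model below is a stand-in that mirrors the documented limits (default 5, max 20, search_depth restricted to "basic" or "advanced") using pydantic, a listed dependency; it is not the project's actual model or test code:

```python
from pydantic import BaseModel, Field, ValidationError

# Stand-in model mirroring the documented parameter limits; the real
# project defines its own request models and tests under src/ and tests/.
class SearchRequest(BaseModel):
    query: str
    max_results: int = Field(default=5, ge=1, le=20)
    search_depth: str = Field(default="basic", pattern="^(basic|advanced)$")

def test_defaults():
    req = SearchRequest(query="redwood trees")
    assert req.max_results == 5
    assert req.search_depth == "basic"

def test_rejects_out_of_range():
    try:
        SearchRequest(query="x", max_results=21)
    except ValidationError:
        pass
    else:
        raise AssertionError("expected ValidationError")
```

Run with pytest, tests in this style exercise each documented bound without touching the network.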
The project includes tools for building and releasing with the latest dependency versions:
# Build package with latest dependency versions
make build-latest
# Complete release workflow: test, build, and check with latest deps
make release-all
# Prepare a release with version management
./scripts/prepare-release.sh [new_version]
Recommended approach for releases with latest dependencies:
make release-all
make upload-latest
Alternative step-by-step approach:
make test-compatibility
make release-build
make upload-latest
One-command release and publish:
make release-publish
Important: Use make upload-latest instead of make upload to prevent dependency downgrades during the upload process. The upload-latest command uses existing distribution files without reinstalling dependencies.
The release commands ensure your package is built and tested with the most recent compatible dependency versions, preventing the downgrades that can occur with traditional build chains.
Build the Docker image:
make docker-build
Alternatively, build directly with Docker:
docker build -t mcp_tavily .
Run a detached Docker container (default name mcp_tavily_container, port 8000 → 8000):
make docker-run
Or manually:
docker run -d --name mcp_tavily_container \
-e TAVILY_API_KEY=your_api_key_here \
-p 8000:8000 mcp_tavily
Stop and remove the container:
make docker-stop
Follow container logs:
make docker-logs
You can override defaults by setting environment variables for the Docker image name (default mcp_tavily), container name (default mcp_tavily_container), host port (default 8000), and container port (default 8000).

You can use the MCP inspector to debug the server:
# Using npx
npx @modelcontextprotocol/inspector python -m mcp_server_tavily
# For development
cd path/to/mcp-tavily
npx @modelcontextprotocol/inspector python -m mcp_server_tavily
We welcome contributions to improve mcp-tavily! Here's how you can help:
1. Fork the repository
2. Create a feature branch (git checkout -b feature/amazing-feature)
3. Commit your changes (git commit -m 'Add amazing feature')
4. Push to the branch (git push origin feature/amazing-feature)
5. Open a Pull Request

For examples of other MCP servers and implementation patterns, see: https://github.com/modelcontextprotocol/servers
mcp-tavily is licensed under the MIT License. See the LICENSE file for details.
{
  "mcpServers": {
    "tavily": {
      "command": "python",
      "args": ["-m", "mcp_server_tavily"],
      "env": {
        "TAVILY_API_KEY": "<YOUR_API_KEY>"
      }
    }
  }
}
Explore related MCPs that share similar capabilities and solve comparable challenges
by exa-labs
Provides real-time web search capabilities to AI assistants via a Model Context Protocol server, enabling safe and controlled access to the Exa AI Search API.
by elastic
Enables natural‑language interaction with Elasticsearch indices via the Model Context Protocol, exposing tools for listing indices, fetching mappings, performing searches, running ES|QL queries, and retrieving shard information.
by graphlit
Enables integration between MCP clients and the Graphlit platform, providing ingestion, extraction, retrieval, and RAG capabilities across a wide range of data sources and connectors.
by mamertofabian
Fast cross‑platform file searching leveraging the Everything SDK on Windows, Spotlight on macOS, and locate/plocate on Linux.
by cr7258
Provides Elasticsearch and OpenSearch interaction via Model Context Protocol, enabling document search, index management, cluster monitoring, and alias operations.
by liuyoshio
Provides natural‑language search and recommendation for Model Context Protocol servers, delivering rich metadata and real‑time updates.
by ihor-sokoliuk
Provides web search capabilities via the SearXNG API, exposing them through an MCP server for seamless integration with AI agents and tools.
by fatwang2
Provides web and news search, URL crawling, sitemap extraction, deep‑reasoning, and trending topic retrieval via Search1API, exposed as an MCP server for integration with AI clients.
by cnych
Provides SEO data retrieval via Ahrefs, exposing MCP tools for backlink analysis, keyword generation, traffic estimation, and keyword difficulty, with automated CAPTCHA solving and response caching.