by TheInformationLab
Enables natural language interaction with Tableau data through AI-driven queries, supporting both Tableau Server and Tableau Cloud environments.
Provides an integration that brings AI capabilities to Tableau Server or Tableau Cloud using the Model Context Protocol (MCP) together with LangChain. Users can ask questions in plain English and receive answers, visualisations, or data extracts directly from the Tableau data they trust.
Quick start:
Clone and build tableau-mcp with npm install and npm run build.
Clone this repo: git clone https://github.com/TheInformationLab/tableau_mcp_starter_kit.git.
Install the Python dependencies: pip install -r requirements.txt.
Copy .env_template to .env and fill in Tableau server details, PAT credentials, and AI model keys (e.g., OPENAI_API_KEY).
Run python web_app.py and open http://localhost:8000 in a browser, or run the dashboard-extension mode with python dashboard_app.py after setting up the custom MCP tools (tableau-mcp-experimental).
Q: Do I need a Tableau licence? A: Yes, you need access to Tableau Server (2025.1+) or Tableau Cloud. A free trial is available via the Tableau Developer Program.
Q: Which Python version is required? A: Python 3.12 or newer.
Q: Can I use a local LLM instead of OpenAI?
A: Absolutely. Set the appropriate provider configuration in .env and ensure the model is reachable from your environment.
Q: How is my data protected when using an external AI service? A: The README warns that data is sent to the chosen model provider. For sensitive data, configure a self‑hosted model or run the pipeline behind your firewall.
Q: What is the purpose of FIXED_DATASOURCE_LUID?
A: It pins the AI tools to a specific Tableau datasource, useful for the custom dashboard‑extension scenario.
Q: How do I add custom MCP tools?
A: Clone the desired repository (e.g., tableau-mcp-experimental), build it with npm, and point mcp_location in dashboard_app.py to the built index.js file.
Q: Where can I get help or contribute? A: Join the Tableau AI Solutions Slack channel, file issues on the GitHub repos, or submit pull requests.
A powerful integration that brings AI functionality to Tableau Server or Tableau Cloud using MCP and LangChain, enabling natural language interactions with the data you trust in Tableau.
This repo is an implementation of tableau-mcp using the MCP tools with LangChain, building on the tableau_langchain_starter_kit.
Before you begin, ensure you have the following:
Access to Tableau Server (2025.1+) or Tableau Cloud, plus a Personal Access Token (PAT)
Python 3.12 or newer
Node.js (tested with 22.15.0 LTS)
An API key for your chosen AI model provider (e.g., an OpenAI API key)
When using this code, data from Tableau will be sent to an external AI model (by default, OpenAI). For learning and testing, it is strongly recommended to use the Superstore dataset included with Tableau.
If you need to process sensitive or proprietary information, consider configuring the tool to use a local AI model instead of an external service. This approach ensures your data remains within your organisation’s infrastructure and reduces the risk of data exposure.
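The exact .env keys for switching providers depend on how you configure this repo, but as a rough illustration, a LangChain chat model can be pointed at a local OpenAI-compatible endpoint (for example, Ollama) instead of the hosted OpenAI API. The endpoint URL and model name below are placeholders for your own setup.
from langchain_openai import ChatOpenAI

# Illustration only: swap the hosted OpenAI API for a local OpenAI-compatible
# server such as Ollama. Adjust the URL and model name to your environment.
local_llm = ChatOpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint (assumed)
    api_key="not-needed-locally",          # placeholder; most local servers ignore it
    model="llama3.1",                      # any chat model you have pulled locally
)

print(local_llm.invoke("Summarise last month's sales in one sentence.").content)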
If you haven't tried Tableau MCP yet, I recommend testing it out with desktop applications like Claude Desktop and VS Code. You can find links to my quickstart tutorials below.
git clone https://github.com/tableau/tableau-mcp.git
cd tableau-mcp
From the Install Guide
Install Node.js (tested with 22.15.0 LTS)
npm install
npm run build
git clone https://github.com/TheInformationLab/tableau_mcp_starter_kit.git
cd tableau_mcp_starter_kit
Creating a virtual environment helps isolate project dependencies:
python -m venv .venv
Windows:
.venv\Scripts\activate
macOS/Linux:
source .venv/bin/activate
💡 Tip: You should see (.venv) at the beginning of your command prompt when the virtual environment is active.
pip install -r requirements.txt
If you encounter any installation issues, try upgrading pip first:
pip install --upgrade pip
cp .env_template .env
Open the .env file in your preferred text editor and configure the following variables:
# Tableau MCP Server Config
TRANSPORT='stdio'
SERVER='https://my-tableau-server.com'
SITE_NAME='TableauSiteName'
PAT_NAME='Tableau Personal Access Token (PAT) Name'
PAT_VALUE='Tableau Personal Access Token (PAT) Secret Key'
# Tableau MCP Server Optional Configs
DATASOURCE_CREDENTIALS=''
DEFAULT_LOG_LEVEL='debug'
INCLUDE_TOOLS=''
EXCLUDE_TOOLS=''
MAX_RESULT_LIMIT=''
DISABLE_QUERY_DATASOURCE_FILTER_VALIDATION=''
# Local Filepath Config
TABLEAU_MCP_FILEPATH='your/local/filepath/to/tableau-mcp/build/index.js'
# Model Providers
OPENAI_API_KEY='from OpenAI developer portal'
# Langfuse
LANGFUSE_PUBLIC_KEY='Public key from https://langfuse.com/'
LANGFUSE_SECRET_KEY='Secret key from https://langfuse.com/'
LANGFUSE_HOST='https://cloud.langfuse.com'
# Custom MCP Tool Extra Configs
# from: https://github.com/wjsutton/tableau-mcp-experimental
FIXED_DATASOURCE_LUID='unique identifier for a data source found via the graphql metadata API'
⚠️ Security Note: Never commit your .env file to version control. It's already included in .gitignore.
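To make the moving parts concrete, here is a minimal sketch (not the repo's actual web_app.py) of how these .env values are typically wired together: the built tableau-mcp server is launched over stdio, its tools are loaded into LangChain, and a LangGraph agent answers a question. It assumes the mcp, langchain-mcp-adapters, langgraph, langchain-openai and python-dotenv packages; adapt model names and error handling to your setup.
import asyncio
import os

from dotenv import load_dotenv
from langchain_mcp_adapters.tools import load_mcp_tools
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

load_dotenv()  # pulls SERVER, PAT_NAME, PAT_VALUE, TABLEAU_MCP_FILEPATH, etc. into the environment

# Launch the built tableau-mcp server (Node) as a stdio subprocess and forward
# the Tableau connection details from .env to it.
server = StdioServerParameters(
    command="node",
    args=[os.environ["TABLEAU_MCP_FILEPATH"]],
    env=dict(os.environ),
)

async def main():
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await load_mcp_tools(session)  # expose the MCP tools as LangChain tools
            agent = create_react_agent(ChatOpenAI(model="gpt-4o-mini"), tools)
            reply = await agent.ainvoke(
                {"messages": [("user", "Which published datasources can you see?")]}
            )
            print(reply["messages"][-1].content)

asyncio.run(main())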
Launch the full web application with dashboard extension support:
python web_app.py
Once running, open your browser and navigate to:
http://localhost:8000
You will now be able to ask questions of your Tableau data in natural language (for example, "What were total sales by region last year?").
You can also use this web application as a dashboard extension inside Tableau:
Once running, open your Tableau workbook or the Superstore dashboard.
On a dashboard page, drag a Dashboard Extension object from the bottom-left Objects menu onto the dashboard, choose a local extension, and select tableau_langchain.trex from the dashboard_extension folder.
The dashboard_app.py script is configured to work with a single datasource, using the custom tools from https://github.com/wjsutton/tableau-mcp-experimental.
To set this up:
git clone https://github.com/wjsutton/tableau-mcp-experimental.git
cd tableau-mcp-experimental
npm install
npm run build
cd ..
cd tableau_mcp_starter_kit
In dashboard_app.py, update mcp_location (line 36) to the local file path of the built tableau-mcp-experimental index.js.
Find your datasource LUID; you can use utilities/find_datasource_luid.gql to query your Tableau Server or Tableau Cloud Metadata API (a Python alternative is sketched after these steps).
In .env, add your datasource LUID to the FIXED_DATASOURCE_LUID environment variable.
Run the dashboard_app script
python dashboard_app.py
To verify the app is running, open your browser and navigate to:
http://localhost:8000
Once running, open your Tableau workbook or the Superstore dashboard.
On a dashboard page, drag a Dashboard Extension object from the bottom-left Objects menu onto the dashboard, choose a local extension, and select tableau_langchain.trex from the dashboard_extension folder.
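If you prefer a script to the .gql file, the sketch below shows one way to look up published datasource LUIDs: sign in to the REST API with the PAT credentials from .env, then post a GraphQL query to the Metadata API. The field names (publishedDatasources, name, luid) follow the public Metadata API schema; adjust the REST API version (3.22 here) to match your server.
import os

import requests
from dotenv import load_dotenv

load_dotenv()
server = os.environ["SERVER"].rstrip("/")

# Sign in with the PAT from .env to obtain a session token.
signin = requests.post(
    f"{server}/api/3.22/auth/signin",
    json={"credentials": {
        "personalAccessTokenName": os.environ["PAT_NAME"],
        "personalAccessTokenSecret": os.environ["PAT_VALUE"],
        "site": {"contentUrl": os.environ["SITE_NAME"]},
    }},
    headers={"Accept": "application/json"},
)
signin.raise_for_status()
token = signin.json()["credentials"]["token"]

# Ask the Metadata API for published datasource names and LUIDs.
resp = requests.post(
    f"{server}/api/metadata/graphql",
    json={"query": "{ publishedDatasources { name luid } }"},
    headers={"X-Tableau-Auth": token},
)
resp.raise_for_status()
for ds in resp.json()["data"]["publishedDatasources"]:
    print(ds["luid"], ds["name"])  # copy the LUID you need into FIXED_DATASOURCE_LUID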
This project is licensed under the MIT License - see the LICENSE file for details.
⭐ If you find this project helpful, please consider giving it a star!