by tacticlaunch
Provides AI assistants with natural‑language access to Linear, allowing retrieval, creation, and updating of issues, projects, and teams.
MCP Linear bridges AI assistants and the Linear project‑management platform. By implementing the Model Context Protocol (MCP), it lets assistants understand natural‑language commands and translate them into GraphQL operations against Linear.
npx -y @smithery/cli install @tacticlaunch/mcp-linear --client cursor # for Cursor
npx -y @smithery/cli install @tacticlaunch/mcp-linear --client claude # for Claude Desktop
Alternatively, install globally:
npm install -g @tacticlaunch/mcp-linear
mcp-linear --token YOUR_LINEAR_API_TOKEN
Configure the server in your client's mcp.json (or equivalent) file, specifying the token via the LINEAR_API_TOKEN environment variable. The full list of available tools is documented in the repository's TOOLS.md.
Q: Which AI clients are supported?
A: Cursor, Claude Desktop, the Claude VSCode extension, and any client that reads the standard MCP mcp.json configuration.
Q: Do I need to run the server locally? A: Yes. The server runs as a Node.js process on your machine and communicates with the client over the MCP protocol.
Q: What Node version is required? A: Node.js v18 or newer.
Q: How is the API token protected?
A: The token is passed via the LINEAR_API_TOKEN environment variable; never hard‑code it in source files.
Q: Where can I see the full list of available tools?
A: In the repository's TOOLS.md file.
Q: Can I contribute new features?
A: Yes. See DEVELOPMENT.md for guidance on local development and publishing.
A Model Context Protocol (MCP) server implementation for the Linear GraphQL API that enables AI assistants to interact with Linear project management systems.
MCP Linear bridges the gap between AI assistants and Linear (a project‑management tool) by implementing the Model Context Protocol. This allows assistants to retrieve, create, and update issues, projects, and teams on your behalf.
Once connected, you can drive Linear with plain‑English prompts, for example asking the assistant to list the issues assigned to you or to create a new issue in a given team.
To use MCP Linear, you'll need a Linear API token. Generate a personal API key from the API section of your Linear account settings and keep it somewhere safe.
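Under the hood, each tool call becomes a GraphQL request against Linear's API. As an optional sanity check for your token (and a sketch of the kind of call the server makes on your behalf, not something MCP Linear requires), you can query Linear's public GraphQL endpoint directly:
# Optional sanity check: call Linear's GraphQL API directly with your token.
# Personal API keys go directly in the Authorization header (OAuth tokens use Bearer).
curl -s https://api.linear.app/graphql \
  -H "Content-Type: application/json" \
  -H "Authorization: YOUR_LINEAR_API_TOKEN" \
  -d '{"query": "{ viewer { id name email } }"}'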
MCP Linear Integration
To install automatically via Smithery, run the command for your client:
npx -y @smithery/cli install @tacticlaunch/mcp-linear --client cursor # for Cursor
npx -y @smithery/cli install @tacticlaunch/mcp-linear --client claude # for Claude Desktop
Add the following to your MCP settings file:
{
  "mcpServers": {
    "linear": {
      "command": "npx",
      "args": ["-y", "@tacticlaunch/mcp-linear"],
      "env": {
        "LINEAR_API_TOKEN": "<YOUR_TOKEN>"
      }
    }
  }
}
Depending on your client, the settings file is located at:
Cursor: ~/.cursor/mcp.json
Claude Desktop (macOS): ~/Library/Application Support/Claude/claude_desktop_config.json
Claude VSCode extension (Cline): ~/Library/Application Support/Code/User/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json
GoMCP: ~/.config/gomcp/config.yaml
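For example, assuming Cursor on macOS or Linux, you might create the file and paste in the JSON block above (a hypothetical sequence; adjust for your own editor and client):
# Hypothetical example for Cursor: create the config directory and edit mcp.json
mkdir -p ~/.cursor
${EDITOR:-nano} ~/.cursor/mcp.json   # paste the "mcpServers" block shown above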
Prerequisites
Node.js v18 or newer
A Linear API token
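A quick way to confirm the version requirement before installing (a simple check, not part of the project itself):
# Check that Node.js is v18 or newer
node --version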
# Install globally
npm install -g @tacticlaunch/mcp-linear
# Or clone and install locally
git clone https://github.com/tacticlaunch/mcp-linear.git
cd mcp-linear
npm install
npm link # Makes the package available globally
Run the server with your Linear API token:
mcp-linear --token YOUR_LINEAR_API_TOKEN
Or set the token in your environment and run without arguments:
export LINEAR_API_TOKEN=YOUR_LINEAR_API_TOKEN
mcp-linear
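If you want to confirm the server starts and speaks MCP before wiring it into a client, you can pipe a single JSON‑RPC initialize request into it over stdio. This is a rough sketch that assumes the standard newline‑delimited MCP stdio transport; the exact response shape depends on the server and protocol version:
# Rough smoke test (assumption: newline-delimited JSON-RPC over stdio, the standard MCP transport).
# The server should answer with an "initialize" result describing its capabilities.
echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.1"}}}' \
  | LINEAR_API_TOKEN=YOUR_LINEAR_API_TOKEN mcp-linear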
See TOOLS.md for a complete list of available tools and planned features.
See DEVELOPMENT.md for more information on how to develop locally.
tacticlaunch/cursor-memory-bank - if you are a developer looking to enhance your Cursor workflow, consider giving it a try.
This project is licensed under the MIT License - see the LICENSE file for details.
{ "mcpServers": { "linear": { "command": "npx", "args": [ "-y", "@tacticlaunch/mcp-linear" ], "env": { "LINEAR_API_TOKEN": "<YOUR_TOKEN>" } } } }
Explore related MCPs that share similar capabilities and solve comparable challenges
by zed-industries
A high‑performance, multiplayer code editor designed for speed and collaboration.
by modelcontextprotocol
Model Context Protocol Servers
by modelcontextprotocol
A Model Context Protocol server for Git repository interaction and automation.
by modelcontextprotocol
A Model Context Protocol server that provides time and timezone conversion capabilities.
by cline
An autonomous coding assistant that can create and edit files, execute terminal commands, and interact with a browser directly from your IDE, operating step‑by‑step with explicit user permission.
by continuedev
Enables faster shipping of code by integrating continuous AI agents across IDEs, terminals, and CI pipelines, offering chat, edit, autocomplete, and customizable agent workflows.
by upstash
Provides up-to-date, version‑specific library documentation and code examples directly inside LLM prompts, eliminating outdated information and hallucinated APIs.
by github
Connects AI tools directly to GitHub, enabling natural‑language interactions for repository browsing, issue and pull‑request management, CI/CD monitoring, code‑security analysis, and team collaboration.
by daytonaio
Provides a secure, elastic infrastructure that creates isolated sandboxes for running AI‑generated code with sub‑90 ms startup, unlimited persistence, and OCI/Docker compatibility.