by movstox
Provides tools to start, stop, and query Toggl time entries through a Model Context Protocol server.
A lightweight MCP server that lets you interact with the Toggl Track API, enabling start/stop of timers, fetching the current entry, and listing workspaces directly from an MCP client.
To get started:
1. Install dependencies with uv sync.
2. Set the TOGGL_API_TOKEN environment variable in the MCP server configuration (see the configuration example below).
3. Call the provided tools (start_tracking, stop_tracking, show_current_time_entry, list_workspaces) from any MCP-compatible client.

Q: Do I need a Toggl account?
A: Yes, a valid Toggl Track account and its API token are required.
Q: Can I specify a workspace or project when starting tracking?
A: Both are optional; if omitted, the default workspace/project is used.
Q: How is the server started?
A: Through the MCP configuration; the server runs as a stdio process launched by the uv command.
Q: What language is the server written in?
A: Python, using the uv tool for dependency management.
Q: Is there any authentication besides the API token?
A: No additional authentication is needed; the token is passed via the TOGGL_API_TOKEN environment variable.
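As a rough sketch of that last point (not code from this repository), a server process might read the injected token like this; the helper name get_api_token is hypothetical:

import os


def get_api_token() -> str:
    """Read the Toggl API token injected via the MCP server configuration (illustrative helper)."""
    token = os.environ.get("TOGGL_API_TOKEN")
    if not token:
        # Fail fast with a clear message if the env block was not configured.
        raise RuntimeError("TOGGL_API_TOKEN is not set; add it to the MCP server configuration.")
    return token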
A Model Context Protocol (MCP) server that provides tools for interacting with Toggl time tracking.
Available tools:

start_tracking - Start tracking a new time entry.
  - title (string): Title/description of the task to track
  - workspace_id (integer): Workspace ID (optional, uses default if not provided)
  - project_id (integer): Project ID (optional)
  - tags (string[]): List of tags (optional)
stop_tracking - Stop the currently running time entry.
list_workspaces - List available Toggl workspaces.
show_current_time_entry - Show the currently running time entry.
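For illustration, the sketch below shows how an MCP-compatible client could launch this server over stdio and call start_tracking using the MCP Python SDK. The SDK usage, the placeholder project path, and the token value are assumptions for the example, not part of this repository.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the server the same way the MCP settings file does (placeholder path and token).
    params = StdioServerParameters(
        command="uv",
        args=["run", "--directory", "/path/to/lazy-toggl-mcp", "python", "server.py"],
        env={"TOGGL_API_TOKEN": "your-actual-api-token-here"},
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Start a timer; workspace_id, project_id and tags are optional.
            result = await session.call_tool("start_tracking", {"title": "Write release notes"})
            print(result)


asyncio.run(main())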
This server uses the Toggl Track API v9. The following endpoints are utilized:
GET /me - Get user information
GET /workspaces - List workspaces
GET /me/time_entries/current - Get current running time entry
POST /workspaces/{workspace_id}/time_entries - Start time tracking
PATCH /workspaces/{workspace_id}/time_entries/{time_entry_id}/stop - Stop time tracking
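As a rough illustration of the underlying API (not the client shipped in toggl_api.py), the snippet below calls two of these endpoints with the requests library, assuming the token is available in TOGGL_API_TOKEN. Toggl Track API v9 uses HTTP Basic auth with the API token as the username and the literal string api_token as the password.

import os

import requests

BASE_URL = "https://api.track.toggl.com/api/v9"
# Basic auth: API token as username, "api_token" as password.
AUTH = (os.environ["TOGGL_API_TOKEN"], "api_token")

# GET /me - fetch the authenticated user's profile.
me = requests.get(f"{BASE_URL}/me", auth=AUTH, timeout=10)
me.raise_for_status()
print(me.json())

# GET /me/time_entries/current - the currently running entry, or null if none.
current = requests.get(f"{BASE_URL}/me/time_entries/current", auth=AUTH, timeout=10)
current.raise_for_status()
print(current.json())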
Install dependencies with uv:
cd lazy-toggl-mcp
uv sync
Add the following configuration to your MCP settings file:
{
  "mcpServers": {
    "lazy-toggl-mcp": {
      "autoApprove": [],
      "disabled": false,
      "timeout": 60,
      "type": "stdio",
      "transportType": "stdio",
      "command": "uv",
      "args": [
        "run",
        "--directory",
        "/path/to/lazy-toggl-mcp",
        "python",
        "server.py"
      ],
      "env": {
        "TOGGL_API_TOKEN": "your-actual-api-token-here"
      }
    }
  }
}
Important: Replace /path/to/lazy-toggl-mcp
with the actual path to this project and your-actual-api-token-here
with your real Toggl API token.
lazy-toggl-mcp/
├── src/
│ └── toggl_server/
│ ├── __init__.py # Package initialization
│ ├── main.py # MCP server implementation (new structure)
│ ├── models.py # Data models and type definitions
│ ├── toggl_api.py # Toggl API client
│ └── utils.py # Utility functions
├── main.py # CLI interface for testing
├── server.py # Main MCP server entry point
├── pyproject.toml # Project configuration and dependencies
├── README.md # This file
├── uv.lock # Dependency lock file
├── .gitignore # Git ignore patterns
└── .python-version # Python version specification
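For orientation only, here is a hedged sketch of how an entry point like server.py could register one of the tools with the FastMCP helper from the MCP Python SDK. The decorator-based wiring shown is an assumption for illustration, not the actual contents of src/toggl_server/.

# Illustrative sketch only; the real implementation lives in src/toggl_server/.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("lazy-toggl-mcp")


@mcp.tool()
def show_current_time_entry() -> dict:
    """Return the currently running Toggl time entry (sketch)."""
    # A real implementation would delegate to the client in toggl_api.py,
    # e.g. GET /me/time_entries/current authenticated with TOGGL_API_TOKEN.
    raise NotImplementedError("see src/toggl_server/toggl_api.py")


if __name__ == "__main__":
    # Run over stdio so the MCP client configured above can launch the server.
    mcp.run()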
MIT License - feel free to modify and use as needed.
{ "mcpServers": { "lazy-toggl-mcp": { "command": "uv", "args": [ "run", "--directory", "/path/to/lazy-toggl-mcp", "python", "server.py" ], "env": { "TOGGL_API_TOKEN": "<YOUR_API_TOKEN>" } } } }