by runebookai
A desktop application that lets anyone chat with local or remote LLMs, schedule hourly or daily prompts, and connect to thousands of Model Context Protocol servers without writing code.
Tome provides a user‑friendly interface for interacting with both cloud‑based and locally hosted large language models. It integrates Model Context Protocol servers so models can call tools such as search engines, file systems, or custom APIs directly from the chat window.
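For context, MCP servers are typically launched as local subprocesses: a client such as Tome runs a short command and then speaks the protocol with the resulting process. A minimal sketch of two such launch commands, using the fetch server mentioned elsewhere on this page plus, as an additional illustration, the official filesystem server:

```shell
# Fetch server: gives the model a tool for retrieving web pages (Python, launched via uvx)
uvx mcp-server-fetch

# Filesystem server: exposes file read/write tools scoped to the given directory (Node, launched via npx)
npx -y @modelcontextprotocol/server-filesystem /path/to/allowed/dir
```

In Tome you paste a command like these into the MCP tab rather than running it yourself; the app manages the server process for you.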
Q: Do I need to be a developer to use Tome? A: No. The UI handles model connections and MCP server setup without any code.
Q: Can I run everything offline? A: Yes. Use Ollama or Qwen3 locally and install only local MCP servers.
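As a sketch of that fully local setup, assuming Ollama is installed and the qwen3 tag is available in its model library:

```shell
# Download the model once; Ollama serves it on localhost:11434 by default,
# which is where Tome (or any other client) can reach it
ollama pull qwen3

# Optional: sanity-check the model from the terminal before using it in Tome
ollama run qwen3 "Say hello"
```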
Q: Which operating systems are supported? A: macOS and Windows are currently supported; Linux support is planned.
Q: How do I add a custom MCP server? A: In the MCP tab, enter the command to launch the server (e.g., uvx mcp-server-fetch) and save. The server will appear in the list for activation.
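A couple of further commands you could paste into that field, both from the official modelcontextprotocol server collection also listed below (package names and flags as documented there; verify against the current releases):

```shell
# Git server: repository inspection and automation tools, scoped to one repo
uvx mcp-server-git --repository /path/to/repo

# Time server: current-time and timezone-conversion tools
uvx mcp-server-time
```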
Q: Where can I find community help? A: Join the Discord community linked in the README for tips, troubleshooting, and feature requests.
Tome is a desktop app that lets anyone harness the magic of LLMs and MCP. Download Tome, connect any local or remote LLM and hook it up to thousands of MCP servers to create your own magical AI-powered spellbook.
What is MCP? MCP stands for Model Context Protocol and lets your LLM access tools, such as search engines, your filesystem, or APIs like Scryfall or Atlassian.
🫥 Want it to be 100% local, 100% private? Use Ollama and Qwen3 with only local MCP servers to cast spells in your own pocket universe. ⚡ Want state of the art cloud models with the latest remote MCP servers? You can have that too. It's all up to you!
🏗️ This is a Technical Preview so bear in mind things will be rough around the edges. Join us on Discord to share tips, tricks, and issues you run into. Star this repo to stay on top of updates and feature releases!
Demo video: https://github.com/user-attachments/assets/0775d100-3eba-4219-9e2f-360a01f28cce
We want to make local LLMs and MCP accessible to everyone. We're building a tool that allows you to be creative with LLMs, regardless of whether you're an engineer, tinkerer, hobbyist, or anyone in between.
We've gotten a lot of amazing feedback in the last few weeks since releasing Tome, but we've got big plans for the future. We want to break LLMs out of their chatbox, and we've got a lot of features coming to help y'all do that.
Explore related MCPs that share similar capabilities and solve comparable challenges.
by zed-industries
A high‑performance, multiplayer code editor designed for speed and collaboration.
by modelcontextprotocol
The official collection of reference Model Context Protocol server implementations.
by modelcontextprotocol
A Model Context Protocol server for Git repository interaction and automation.
by modelcontextprotocol
A Model Context Protocol server that provides time and timezone conversion capabilities.
by cline
An autonomous coding assistant that can create and edit files, execute terminal commands, and interact with a browser directly from your IDE, operating step‑by‑step with explicit user permission.
by continuedev
Enables faster shipping of code by integrating continuous AI agents across IDEs, terminals, and CI pipelines, offering chat, edit, autocomplete, and customizable agent workflows.
by upstash
Provides up-to-date, version‑specific library documentation and code examples directly inside LLM prompts, eliminating outdated information and hallucinated APIs.
by GLips
Provides Figma layout and styling information to AI coding agents, enabling one‑shot implementation of designs in any framework.
by idosal
Provides a remote Model Context Protocol server that transforms any public GitHub repository into an up‑to‑date documentation hub, enabling AI assistants to fetch live code and docs, dramatically reducing hallucinations.