by elie222
Organizes email inbox, drafts replies in the user's tone, tracks follow‑ups, and provides analytics to achieve inbox zero quickly.
Inbox Zero is an AI‑powered personal assistant that helps you manage your email faster. It automatically categorizes senders, suggests replies in your own style, tracks messages that need a response, and offers analytics on your email habits.
Q: Do I need to pay for any services?
A: The core app is open source and free. Premium features require a paid subscription or making yourself an admin in the `.env` file.
Q: Which email providers are supported?
A: Google Gmail (via OAuth) and Microsoft Outlook/Exchange are supported.
Q: Can I run the app locally without Docker?
A: Yes. After cloning the repo, run `pnpm install`, configure `.env`, and start with `turbo dev` or `pnpm run dev`.
Q: What LLMs can I use?
A: OpenAI, Anthropic, AWS Bedrock (Anthropic), Google Gemini, Groq, and local Ollama models are supported. Set the appropriate environment variables (see the LLM setup section below).
Q: How do I receive real‑time email notifications?
A: Set up Google Pub/Sub, create a topic and subscription, and configure `GOOGLE_PUBSUB_TOPIC_NAME` and `GOOGLE_PUBSUB_VERIFICATION_TOKEN` in `.env`. The app provides a webhook endpoint for push notifications.
Q: Is there a Docker image available?
A: Yes. Build with `docker build --build-arg NEXT_PUBLIC_BASE_URL="https://your-domain.com" -t inbox-zero -f docker/Dockerfile.prod .` and run with the required environment variables.
Inbox Zero exists to help you spend less time in your inbox, so you can focus on what matters.
Learn more in our docs.
Screenshots: AI Assistant, Reply Zero, Gmail client, Bulk Unsubscriber.
To request a feature, open a GitHub issue or join our Discord.
We offer a hosted version of Inbox Zero at https://getinboxzero.com. To self-host, follow the steps below.
Here's a video on how to set up the project. It covers the same steps mentioned in this document but goes into greater detail on setting up the external services.
Make sure you have the prerequisites installed before starting.
The required external services are covered in the sections below, with detailed setup instructions for each.
Create your own `.env` file from the example supplied:
cp apps/web/.env.example apps/web/.env
cd apps/web
pnpm install
Set the environment variables in the newly created `.env`. You can see a list of required variables in `apps/web/env.ts`.
The required environment variables:
- `AUTH_SECRET` -- can be any random string (try using `openssl rand -hex 32` for a quick secure random string)
- `EMAIL_ENCRYPT_SECRET` -- secret key for encrypting OAuth tokens (try using `openssl rand -hex 32` for a secure key)
- `EMAIL_ENCRYPT_SALT` -- salt for encrypting OAuth tokens (try using `openssl rand -hex 16` for a secure salt)
- `NEXT_PUBLIC_BASE_URL` -- the URL where your app is hosted (e.g., `http://localhost:3000` for local development or `https://yourdomain.com` for production)
- `INTERNAL_API_KEY` -- a secret key for internal API calls (try using `openssl rand -hex 32` for a secure key)
- `UPSTASH_REDIS_URL` -- Redis URL from Upstash (can be empty if you are using Docker Compose)
- `UPSTASH_REDIS_TOKEN` -- Redis token from Upstash (or specify your own random string if you are using Docker Compose)
When using Vercel with Fluid Compute turned off, you should set `MAX_DURATION=300` or lower. See Vercel limits for different plans here.
`GOOGLE_CLIENT_ID` -- Google OAuth client ID. More info here.
`GOOGLE_CLIENT_SECRET` -- Google OAuth client secret. More info here.

Go to Google Cloud. Create a new project if necessary.

Create new credentials:

1. If the banner shows up, configure the consent screen (if not, you can do this later):
   - Click `Get Started`.
   - Choose `External`.
   - Click `Create`.
2. Create new credentials:
   - Click the `+Create Credentials` button and choose OAuth Client ID.
   - For `Application Type`, choose `Web application`.
   - Under Authorized JavaScript origins, add `http://localhost:3000`.
   - Under `Authorized redirect URIs` enter:
     - `http://localhost:3000/api/auth/callback/google`
     - `http://localhost:3000/api/google/linking/callback`
   - Click `Create`.
3. Update the `.env` file: copy the Client ID into `GOOGLE_CLIENT_ID` and the Client secret into `GOOGLE_CLIENT_SECRET`.
4. Update scopes:
   - Go to `Data Access` in the left sidebar (or click the link above).
   - Click `Add or remove scopes`.
   - Paste the following into the `Manually add scopes` box:
     - `https://www.googleapis.com/auth/userinfo.profile`
     - `https://www.googleapis.com/auth/userinfo.email`
     - `https://www.googleapis.com/auth/gmail.modify`
     - `https://www.googleapis.com/auth/gmail.settings.basic`
     - `https://www.googleapis.com/auth/contacts`
   - Click `Update`, then `Save` on the Data Access page.
5. Add yourself as a test user:
   - In the `Test users` section, click `+Add users`.
   - Enter your email and click `Save`.
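For reference, the resulting `.env` entries look roughly like this (placeholder values only, not real credentials):

```sh
# Placeholder values -- copy the real Client ID and secret from the Google Cloud console.
GOOGLE_CLIENT_ID="1234567890-abc123.apps.googleusercontent.com"
GOOGLE_CLIENT_SECRET="your-google-client-secret"
```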
`MICROSOFT_CLIENT_ID` -- Microsoft OAuth client ID
`MICROSOFT_CLIENT_SECRET` -- Microsoft OAuth client secret

Go to the Microsoft Azure Portal. Create a new Azure Active Directory app registration:

1. Navigate to Azure Active Directory.
2. Go to "App registrations" in the left sidebar or search for it in the search bar.
3. Click "New registration".
4. Add the redirect URIs:
   - `http://localhost:3000/api/auth/callback/microsoft`
   - `http://localhost:3000/api/outlook/linking/callback`
5. Get your credentials: the Application (client) ID becomes `MICROSOFT_CLIENT_ID`, and a client secret you create becomes `MICROSOFT_CLIENT_SECRET`.
6. Configure API permissions:
   - In the "Manage" menu, click "API permissions" in the left sidebar.
   - Click "Add a permission".
   - Select "Microsoft Graph".
   - Select "Delegated permissions".
   - Add the required permissions.
   - Click "Add permissions".
   - Click "Grant admin consent" if you're an admin.
7. Update your `.env` file with the credentials:
   MICROSOFT_CLIENT_ID=your_client_id_here
   MICROSOFT_CLIENT_SECRET=your_client_secret_here
You need to set an LLM, but you can use a local one too:
For the LLM, you can use Anthropic, OpenAI, or Anthropic on AWS Bedrock. You can also use Ollama by setting the following environment variables:
OLLAMA_BASE_URL=http://localhost:11434/api
NEXT_PUBLIC_OLLAMA_MODEL=phi3
Note: If you need to access Ollama hosted locally and the application is running in Docker, you can use `http://host.docker.internal:11434/api` as the base URL. You might also need to set `OLLAMA_HOST` to `0.0.0.0` in the Ollama configuration file.
You can select the model you wish to use on the `/settings` page of the app.
If you are using local Ollama, you can set it as the default provider:
DEFAULT_LLM_PROVIDER=ollama
If this is the case, you must also set the `ECONOMY_LLM_PROVIDER` environment variable.
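A minimal sketch of a fully local LLM configuration; setting `ECONOMY_LLM_PROVIDER=ollama` is an assumption here (a cloud provider can serve the economy tier instead), so adjust to taste:

```sh
# Fully local LLM setup (sketch). ECONOMY_LLM_PROVIDER=ollama is an assumed choice;
# any supported provider can be used for the economy tier instead.
DEFAULT_LLM_PROVIDER=ollama
ECONOMY_LLM_PROVIDER=ollama
OLLAMA_BASE_URL=http://localhost:11434/api
NEXT_PUBLIC_OLLAMA_MODEL=phi3
```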
We use Postgres for the database. For Redis, you can use Upstash Redis or set up your own Redis instance.
You can run Postgres & Redis locally using `docker-compose`:
docker-compose up -d # -d will run the services in the background
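If you use the bundled Docker Compose services, the corresponding `.env` values point at localhost. The database credentials and name below are assumptions for illustration, so check `docker-compose.yml` for the actual values:

```sh
# Assumed local values -- verify the credentials and database name against docker-compose.yml.
DATABASE_URL="postgresql://postgres:password@localhost:5432/inboxzero"
# As noted above, UPSTASH_REDIS_URL can be left empty with Docker Compose,
# and UPSTASH_REDIS_TOKEN can be any random string you choose.
UPSTASH_REDIS_URL=""
UPSTASH_REDIS_TOKEN="any-random-string"
```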
To run the migrations:
pnpm prisma migrate dev
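For a deployed environment you typically apply the committed migrations rather than create new ones; a sketch using the standard Prisma CLI (run from `apps/web`, same as above):

```sh
# Apply existing migrations without generating new ones (standard Prisma command).
pnpm prisma migrate deploy
```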
To run the app locally for development (slower):
pnpm run dev
Or from the project root:
turbo dev
To build and run the app locally in production mode (faster):
pnpm run build
pnpm start
Open http://localhost:3000 to view the app in your browser.
Many features are available only to premium users. To upgrade yourself, make yourself an admin in the `.env` file: `ADMINS=hello@gmail.com`
Then upgrade yourself at: http://localhost:3000/admin.
Follow instructions here.
Set the env var `GOOGLE_PUBSUB_TOPIC_NAME`.
When creating the subscription, select Push and the URL should look something like: `https://www.getinboxzero.com/api/google/webhook?token=TOKEN` or `https://abc.ngrok-free.app/api/google/webhook?token=TOKEN`, where the domain is your domain. Set `GOOGLE_PUBSUB_VERIFICATION_TOKEN` in your `.env` file to be the value of `TOKEN`.
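If you prefer the `gcloud` CLI over the console, here is a sketch of creating the topic and push subscription; the names `inbox-zero-emails` and `inbox-zero-push` are hypothetical, and the endpoint should use your own domain and token:

```sh
# Hypothetical topic/subscription names -- adjust to your project and domain.
gcloud pubsub topics create inbox-zero-emails
gcloud pubsub subscriptions create inbox-zero-push \
  --topic=inbox-zero-emails \
  --push-endpoint="https://your-domain.com/api/google/webhook?token=TOKEN"
# Then set GOOGLE_PUBSUB_TOPIC_NAME and GOOGLE_PUBSUB_VERIFICATION_TOKEN in .env accordingly.
```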
To run in development, ngrok can be helpful:
ngrok http 3000
# or with an ngrok domain to keep your endpoint stable (set `XYZ`):
ngrok http --domain=XYZ.ngrok-free.app 3000
And then update the webhook endpoint in the Google PubSub subscriptions dashboard.
To start watching emails visit: /api/watch/all
Set a cron job to run these endpoints. The Google watch is necessary; the others are optional.
"crons": [
{
"path": "/api/watch/all",
"schedule": "0 1 * * *"
},
{
"path": "/api/resend/summary/all",
"schedule": "0 16 * * 1"
},
{
"path": "/api/reply-tracker/disable-unused-auto-draft",
"schedule": "0 3 * * *"
}
]
Here are some easy ways to run cron jobs. Upstash is a free, easy option. I could never get the Vercel `vercel.json` crons to work. Open to PRs if you find a fix for that.
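If you self-host outside Vercel, a plain system crontab hitting the same endpoints is another option; this sketch assumes your own domain and that the routes are reachable as-is (depending on your deployment they may require an auth header):

```sh
# Register the three schedules from the config above with system cron (hypothetical domain).
( crontab -l 2>/dev/null; cat <<'CRON'
0 1 * * * curl -fsS "https://your-domain.com/api/watch/all"
0 16 * * 1 curl -fsS "https://your-domain.com/api/resend/summary/all"
0 3 * * * curl -fsS "https://your-domain.com/api/reply-tracker/disable-unused-auto-draft"
CRON
) | crontab -
```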
When building the Docker image, you must specify your `NEXT_PUBLIC_BASE_URL` as a build argument. This is because Next.js embeds `NEXT_PUBLIC_*` variables at build time, not runtime.
# For production with your custom domain
docker build \
--build-arg NEXT_PUBLIC_BASE_URL="https://your-domain.com" \
-t inbox-zero \
-f docker/Dockerfile.prod .
# For local development (default)
docker build -t inbox-zero -f docker/Dockerfile.prod .
After building, run the container with your runtime secrets:
docker run -p 3000:3000 \
  -e DATABASE_URL="your-database-url" \
  -e AUTH_SECRET="your-auth-secret" \
  -e GOOGLE_CLIENT_ID="your-google-client-id" \
  -e GOOGLE_CLIENT_SECRET="your-google-client-secret" \
  inbox-zero
# ...add any other required runtime environment variables with -e
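As an alternative to individual `-e` flags, you can load everything from your env file (assuming it already contains all required runtime values; note that `docker run --env-file` reads lines literally and does not strip quotes):

```sh
# Load runtime variables from the env file instead of passing each one with -e.
docker run -p 3000:3000 --env-file apps/web/.env inbox-zero
```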
Important: If you need to change `NEXT_PUBLIC_BASE_URL`, you must rebuild the Docker image. It cannot be changed at runtime.
For more detailed Docker build instructions and security considerations, see docker/DOCKER_BUILD_GUIDE.md.
You can view open tasks in our GitHub Issues. Join our Discord to discuss tasks and check what's being worked on.
ARCHITECTURE.md explains the architecture of the project (LLM generated).