by kiwamizamurai
Provides MCP server implementation for Kibela API integration, enabling LLMs to search, retrieve, and manage Kibela notes, groups, folders, users, and attachments.
MCP Kibela Server connects Large Language Models to Kibela, a collaborative knowledge-base platform. It exposes Kibela functionality through a set of MCP tools, allowing automated agents to query notes, fetch content, manage groups and folders, like or unlike notes, and retrieve user information.
Run with `npx` by adding the following to your MCP client configuration:
{
  "mcpServers": {
    "kibela": {
      "command": "npx",
      "args": ["-y", "@kiwamizamurai/mcp-kibela-server"],
      "env": {
        "KIBELA_TEAM": "YOUR_TEAM_NAME",
        "KIBELA_TOKEN": "YOUR_TOKEN"
      }
    }
  }
}
1. Add the configuration above to `~/.cursor/mcp.json` (or the appropriate MCP client config).
2. Make sure `KIBELA_TEAM` and `KIBELA_TOKEN` are set with your Kibela team name and API token.
3. Invoke the exposed tools (e.g. `kibela_search_notes`, `kibela_get_note_content`) from your LLM workflow or CLI.
4. To run from source instead, install dependencies with `npm install` and execute the built index with Node, or use the provided Docker image.
5. For SSE transport, connect to the server URL (e.g. `http://localhost:3000/sse`).

Tips:

- Set `include_image_data` to `true` to embed image data URLs; otherwise only URLs are returned.
- Control how many notes are returned with the `limit` parameter.

MCP server implementation for Kibela API integration, enabling LLMs to interact with Kibela content.
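As a concrete sketch, an MCP client invokes one of these tools with a JSON-RPC `tools/call` request such as the one below. The query value is a placeholder; the argument names follow the `kibela_search_notes` inputs documented in this README:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "kibela_search_notes",
    "arguments": {
      "query": "deployment checklist",
      "sortBy": "RELEVANT",
      "isArchived": false
    }
  }
}
```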
[!TIP] This extension performs GraphQL schema introspection using the `buildClientSchema`, `getIntrospectionQuery`, and `printSchema` functions from the `graphql` package to reverse engineer Kibela's API. For more details, see here.
- `KIBELA_TEAM`: Your Kibela team name (required)
- `KIBELA_TOKEN`: Your Kibela API token (required)

Add to your `~/.cursor/mcp.json`:
{
  "mcpServers": {
    "kibela": {
      "command": "npx",
      "args": ["-y", "@kiwamizamurai/mcp-kibela-server"],
      "env": {
        "KIBELA_TEAM": "YOUR_TEAM_NAME",
        "KIBELA_TOKEN": "YOUR_TOKEN"
      }
    }
  }
}
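When launching the server by hand rather than through a client config, the same two variables can be exported in the shell first (the values below are placeholders):

```shell
# Required by the server; substitute your own team subdomain and API token.
export KIBELA_TEAM="my-team"
export KIBELA_TOKEN="YOUR_TOKEN"
```

The server reads these at startup, exactly as the `env` block does inside the client config.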
If you want to use Docker instead:
{
  "mcpServers": {
    "kibela": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "KIBELA_TEAM",
        "-e",
        "KIBELA_TOKEN",
        "ghcr.io/kiwamizamurai/mcp-kibela-server:latest"
      ],
      "env": {
        "KIBELA_TEAM": "YOUR_TEAM_NAME",
        "KIBELA_TOKEN": "YOUR_TOKEN"
      }
    }
  }
}
The server provides the following tools:

- **Search Kibela notes with given query**
  - `query` (string): Search query
  - `coediting` (boolean, optional): Filter by co-editing status
  - `isArchived` (boolean, optional): Filter by archive status
  - `sortBy` (string, optional): Sort by (RELEVANT, CONTENT_UPDATED_AT)
  - `userIds` (string[], optional): Filter by user IDs
  - `folderIds` (string[], optional): Filter by folder IDs
- **Get your latest notes from Kibela**
  - `limit` (number, optional): Number of notes to fetch (default: 15)
- **Get content and comments of a specific note**
  - `id` (string): Note ID
  - `include_image_data` (boolean, optional): Whether to include image data URLs in the response (default: false)
- **Get list of accessible groups**
- **Get folders in a group**
  - `groupId` (string): Group ID
  - `parentFolderId` (string, optional): Parent folder ID for nested folders
- **Get notes in a group that are not attached to any folder**
  - `groupId` (string): Group ID
- **Get notes in a folder**
  - `folderId` (string): Folder ID
  - `limit` (number, optional): Number of notes to fetch (default: 100)
- **Get list of users**
- **Like a note**
  - `noteId` (string): Note ID
- **Unlike a note**
  - `noteId` (string): Note ID
- **Get your recently viewed notes**
  - `limit` (number, optional): Number of notes to fetch (max 15)
- **Get note content by its path or URL**
  - `path` (string): Note path (e.g. '/group/folder/note') or full Kibela URL (e.g. 'https://team.kibe.la/notes/123')
  - `include_image_data` (boolean, optional): Whether to include image data URLs in the response (default: false)

Install dependencies with `npm install`.
For local development, update your ~/.cursor/mcp.json
:
{
  "mcpServers": {
    "kibela": {
      "command": "node",
      "args": ["path/to/mcp-kibela-server/dist/src/index.js"],
      "env": {
        "KIBELA_TEAM": "YOUR_TEAM_NAME",
        "KIBELA_TOKEN": "YOUR_TOKEN"
      }
    }
  }
}
To debug with the MCP Inspector, set the environment variables (`KIBELA_TEAM`, `KIBELA_TOKEN`) and run:

npx @modelcontextprotocol/inspector node ./dist/src/index.js
Build and run locally:
docker build -t mcp-kibela-server .
Then use this configuration:
{
  "mcpServers": {
    "kibela": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "KIBELA_TEAM",
        "-e",
        "KIBELA_TOKEN",
        "mcp-kibela-server"
      ],
      "env": {
        "KIBELA_TEAM": "YOUR_TEAM_NAME",
        "KIBELA_TOKEN": "YOUR_TOKEN"
      }
    }
  }
}
For SSE transport, ensure the server URL is set to: http://localhost:3000/sse
Explore related MCPs that share similar capabilities and solve comparable challenges:

- by modelcontextprotocol: A basic implementation of persistent memory using a local knowledge graph. This lets Claude remember information about the user across chats.
- by topoteretes: Provides dynamic memory for AI agents through modular ECL (Extract, Cognify, Load) pipelines, enabling seamless integration with graph and vector stores using minimal code.
- by basicmachines-co: Enables persistent, local-first knowledge management by allowing LLMs to read and write Markdown files during natural conversations, building a traversable knowledge graph that stays under the user's control.
- by smithery-ai: Provides read and search capabilities for Markdown notes in an Obsidian vault for Claude Desktop and other MCP clients.
- by chatmcp: Summarize chat messages by querying a local chat database and returning concise overviews.
- by dmayboroda: Provides on-premises conversational retrieval-augmented generation (RAG) with configurable Docker containers, supporting fully local execution, ChatGPT-based custom GPTs, and Anthropic Claude integration.
- by GreatScottyMac: Provides a project-specific memory bank that stores decisions, progress, architecture, and custom data, exposing a structured knowledge graph via MCP for AI assistants and IDE tools.
- by andrea9293: Provides document management and AI-powered semantic search for storing, retrieving, and querying text, markdown, and PDF files locally without external databases.
- by scorzeth: Provides a local MCP server that interfaces with a running Anki instance to retrieve, create, and update flashcards through standard MCP calls.