by wildfly-extras
Enable generative AI capabilities for monitoring and managing WildFly servers through integrated tooling.
WildFly MCP provides a set of tools that connect WildFly servers with generative‑AI chat interfaces, allowing administrators to interact with server monitoring and management functions using natural language.
wildfly-mcp-server/: the MCP server, deployed alongside your WildFly instance.
wildfly-chat-bot/: exposes a conversational interface that communicates with the MCP server.
container-images/: images for quick Kubernetes/OpenShift deployment, bundling both the server and the chat bot.
mcp-stdio-sse-gateway/: translates between the STDIO and SSE protocols.
wait-mcp-server/: can simulate delays.
Q: Do I need a specific LLM provider? A: The MCP server is agnostic; you can connect any LLM that can communicate over the defined protocol.
Q: Is there a Kubernetes operator? A: Not currently, but the provided container images can be deployed with standard manifests or Helm charts.
Q: How do I secure the communication? A: Use TLS in the underlying MCP transport and configure authentication tokens in the chat bot.
Q: Can I extend the server with custom commands? A: Yes, the MCP server is Java‑based; you can add new handlers to expose additional WildFly management actions.
Q: What is the purpose of the Wait MCP Server? A: It introduces a programmable delay, useful for pacing multi‑step AI interactions or simulating long‑running tasks.
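Since the FAQ notes that the Java-based MCP server can be extended with new handlers, here is a minimal sketch of the idea: a registry that maps tool names to handlers, so a custom WildFly management action can be exposed as a named tool. The `register`/`invoke` names and the `Function`-based handler shape are illustrative assumptions, not the actual WildFly MCP API.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Hypothetical sketch of a tool registry; names are illustrative,
// not the real WildFly MCP server classes.
public class CustomToolSketch {
    // Each tool maps a request argument to a textual result.
    static final Map<String, Function<String, String>> TOOLS = new HashMap<>();

    static void register(String name, Function<String, String> handler) {
        TOOLS.put(name, handler);
    }

    static String invoke(String name, String arg) {
        Function<String, String> handler = TOOLS.get(name);
        return handler == null ? "unknown tool: " + name : handler.apply(arg);
    }

    public static void main(String[] args) {
        // Expose a made-up management action as a tool.
        register("serverStatus", host -> "Server " + host + " is running");
        System.out.println(invoke("serverStatus", "localhost"));
        // prints: Server localhost is running
    }
}
```

In a real handler, the lambda body would call the WildFly management API instead of returning a canned string.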
This project aims to define tooling that allows WildFly users to benefit from Generative AI capabilities when monitoring and managing WildFly servers.
WildFly MCP Server: A WildFly MCP server that integrates with your AI chatbot so you can interact with a WildFly server using natural language.
WildFly Chat Bot: An AI chatbot for interacting with WildFly servers. It can also integrate additional MCP servers (STDIO and SSE protocols).
Container Images: Container images for the MCP server and for the chat bot (the chat bot image bundles both the chat bot and the MCP server, ready to interact with your WildFly servers in the cloud). An example OpenShift deployment is provided.
MCP STDIO to SSE protocol gateway: A Java gateway that lets chat applications supporting only the STDIO protocol integrate SSE MCP servers.
Wait MCP Server: A simple MCP server that allows an LLM to wait for a number of seconds. This can be useful in some workflows.
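The idea behind the Wait MCP Server can be sketched in a few lines: a "wait" tool that blocks for a requested number of seconds and reports back. The method name, the 60-second cap, and the return string are illustrative assumptions, not the actual implementation.

```java
// Hypothetical sketch of a "wait" tool like the one the Wait MCP Server
// exposes; names and the cap are illustrative assumptions.
public class WaitToolSketch {
    // Sleep for the requested seconds, clamped to [0, 60] to keep
    // callers responsive and guard against abusive requests.
    static String waitSeconds(int seconds) {
        int capped = Math.max(0, Math.min(seconds, 60));
        try {
            Thread.sleep(capped * 1000L);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt(); // preserve interrupt status
            return "interrupted";
        }
        return "waited " + capped + "s";
    }

    public static void main(String[] args) {
        System.out.println(waitSeconds(1));
        // prints: waited 1s
    }
}
```

Returning a textual confirmation lets the LLM observe that the delay completed before moving to the next step of a multi-step workflow.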