by jobswithgpt
Enables AI models to perform job‑search queries through a lightweight MCP server that can be accessed by Claude Desktop or OpenAI tools.
jobswithgpt provides a simple MCP server implementation that exposes job‑search functionality. The server can be called directly by AI clients and models (e.g., Claude Desktop, OpenAI's GPT‑4.1‑mini) to retrieve relevant job listings from natural‑language queries. To run the server locally, install it with the MCP CLI:
uv run mcp install server.py
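For orientation, here is a minimal sketch of what a server.py exposing a job‑search tool could look like, built on the MCP Python SDK's FastMCP. The tool name search_jobs matches the one mentioned below, but the signature and placeholder results are hypothetical; the real server also exposes other tools such as location_autocomplete.

# server.py — minimal sketch of an MCP server exposing a job-search tool.
# The job-fetching logic is a placeholder; a real implementation would query
# one or more job boards and normalize their results.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("jobswithgpt")

@mcp.tool()
def search_jobs(keywords: list[str], location: str = "") -> list[dict]:
    """Return job listings matching the given keywords and optional location."""
    # Placeholder result in the shape the model would receive.
    return [{
        "title": "Backend Engineer",
        "company": "ExampleCo",
        "location": location or "Remote",
        "url": "https://example.com/jobs/1",
    }]

if __name__ == "__main__":
    mcp.run()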
Register the server name (jobswithgpt) and URL (https://jobswithgpt.com/mcp/) in the tool definition for Claude or OpenAI. A request such as "find jobs for python devs in sf" then goes through the AI model's tool call (e.g., search_jobs), and the model receives formatted job results.
Q: Which Python version is required?
A: Python 3.12 or newer.
Q: Do I need an API key?
A: The public demo at jobswithgpt.com/mcp/ does not require one, but self‑hosted instances may.
Q: Can I add more job sources?
A: Yes, extend server.py to query additional job boards and return results in the same format (see the sketch after this FAQ).
Q: Is there a Node.js installation method?
A: The official guide uses UV for Python; a Node wrapper is not provided yet.
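To illustrate the answer about adding job sources, one approach is a small fetch helper whose normalized results are merged into what search_jobs already returns. The board name, endpoint, and response fields below are hypothetical.

# Hypothetical extension inside server.py: pull results from one more board
# and return them in the same shape as the existing search_jobs output.
import httpx

def fetch_example_board(keywords: list[str]) -> list[dict]:
    """Query an additional (hypothetical) job board and normalize its fields."""
    resp = httpx.get(
        "https://api.example-board.test/jobs",  # placeholder endpoint
        params={"q": " ".join(keywords)},
        timeout=10,
    )
    resp.raise_for_status()
    return [
        {"title": job["title"], "company": job["company"], "url": job["link"]}
        for job in resp.json().get("results", [])
    ]

Inside search_jobs, the extra results can simply be appended to the existing list before it is returned.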
To use the hosted server from Claude Desktop (via npx and mcp-remote), locate (or create) the Claude Desktop config file:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
Insert this JSON:
{
  "mcpServers": {
    "jobswithgpt": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote@latest",
        "https://jobswithgpt.com/mcp"
      ]
    }
  }
}
Quit Claude Desktop completely and reopen it.
Your new server jobswithgpt should appear in the paperclip menu under Tools.
OpenAI can directly use the hosted MCP server (https://jobswithgpt.com/mcp). The example below connects through the OpenAI Agents SDK; an alternative sketch using the Responses API follows the example output.
import asyncio
import json

from agents import Agent, Runner
from agents.mcp.server import MCPServerStreamableHttp

MCP_URL = "https://jobswithgpt.com/mcp"  # your FastMCP streamable HTTP endpoint


async def main():
    async with MCPServerStreamableHttp(params={"url": MCP_URL}, name="jobswithgpt") as server:
        agent = Agent(
            name="jobs-mcp-local",
            mcp_servers=[server],
            instructions=(
                "Use the MCP server tools. First call location_autocomplete to get a geonameid "
                "for 'Seattle', then call search_jobs with keywords=['python'] and that geonameid."
            ),
        )
        res = await Runner.run(agent, "Find machine learning jobs in san francisco.")
        print(res.final_output)


if __name__ == "__main__":
    asyncio.run(main())
Here are some Python developer job opportunities in San Francisco:
1. Software Engineer - Backend, Product Engineering at Baton
[Apply here](https://job-boards.greenhouse.io/baton/jobs/4011483007)
2. Senior Backend Engineer at Stellic
[Apply here](https://job-boards.greenhouse.io/stellic/jobs/4705805007)
3. Software Engineer - Backend at Julius AI
[Apply here](https://jobs.ashbyhq.com/julius/75f8ef44-4fa4-46fa-b416-c7b697078eca)
etc
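For comparison, the hosted server can also be attached to a request without running an MCP client yourself. This is a minimal sketch assuming the OpenAI Responses API's remote MCP tool type; the model name and approval setting are illustrative.

from openai import OpenAI

client = OpenAI()

# Attach the hosted jobswithgpt server as a remote MCP tool (sketch; the tool
# fields follow the Responses API's MCP tool schema).
response = client.responses.create(
    model="gpt-4.1-mini",
    tools=[{
        "type": "mcp",
        "server_label": "jobswithgpt",
        "server_url": "https://jobswithgpt.com/mcp",
        "require_approval": "never",  # skip per-call approval for this demo
    }],
    input="Find python developer jobs in San Francisco.",
)
print(response.output_text)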
Explore related MCPs that share similar capabilities and solve comparable challenges
by exa-labs
Provides real-time web search capabilities to AI assistants via a Model Context Protocol server, enabling safe and controlled access to the Exa AI Search API.
by perplexityai
Enables Claude and other MCP‑compatible applications to perform real‑time web searches through the Perplexity (Sonar) API without leaving the MCP ecosystem.
by MicrosoftDocs
Provides semantic search and fetch capabilities for Microsoft official documentation, returning content in markdown format via a lightweight streamable HTTP transport for AI agents and development tools.
by elastic
Enables natural‑language interaction with Elasticsearch indices via the Model Context Protocol, exposing tools for listing indices, fetching mappings, performing searches, running ES|QL queries, and retrieving shard information.
by graphlit
Enables integration between MCP clients and the Graphlit platform, providing ingestion, extraction, retrieval, and RAG capabilities across a wide range of data sources and connectors.
by ihor-sokoliuk
Provides web search capabilities via the SearXNG API, exposing them through an MCP server for seamless integration with AI agents and tools.
by mamertofabian
Fast cross‑platform file searching leveraging the Everything SDK on Windows, Spotlight on macOS, and locate/plocate on Linux.
by spences10
Provides unified access to multiple search engines, AI response tools, and content processing services through a single Model Context Protocol server.
by cr7258
Provides Elasticsearch and OpenSearch interaction via Model Context Protocol, enabling document search, index management, cluster monitoring, and alias operations.