Model Context Protocol · Server
Expose six OphyAI tools to any MCP-compatible agent: Claude Desktop, Claude Code, Cursor, or your own LLM stack. Real-time interview answers, ATS resume scoring, JD-tailored resume rewriting, STAR behavioral stories, cover-letter generation, and job search — installed once, available everywhere your agent runs.
TL;DR — The Quick Answer
OphyAI MCP Server exposes six interview, resume, and job-search tools to any Model Context Protocol–compatible agent (Claude Desktop, Claude Code, Cursor, custom stacks). The design doc at /integrations/mcp-server.md and the OpenAPI spec at /integrations/openapi.yaml are real today; the official @ophyai/mcp-server npm package is on the v0.1 roadmap. Builders can implement against the OpenAPI immediately. Free with any OphyAI account; transport is stdio at v0.1 and streamable HTTP at v1.
Real prompts you can paste in today. The agent calls OphyAI's APIs in the background.
Wire the OphyAI MCP server into Claude Desktop, then ask Claude for STAR stories, ATS scores, or live interview answers — Claude calls OphyAI tools the same way it calls filesystem or git tools.
Example prompt
"Use the OphyAI tools to give me 5 STAR stories grounded in my resume at ~/Documents/resume.md, scoped to the JD I'll paste next."
In a Claude Code session, ask the agent to tailor a resume Markdown file in your repo against a JD — and write the result back to disk. The MCP resume_tailor tool keeps the truth-preserving flag on by default.
Example prompt
"Use ophyai.resume_tailor to tailor ./me/resume.md against ./me/jds/acme-senior-be.md and write the result to ./me/resume.acme.md. Don't fabricate experience."
Use the MCP server inside Cursor to keep a Markdown story library up to date. Each new role you describe generates a STAR-format entry the agent can pull from later.
Example prompt
"For each role in my career-history.md, generate a STAR story for the competency 'leading through ambiguity' using ophyai.star_answer. Append to ./me/stories/leadership.md."
Inside a custom LangChain or Vercel AI SDK agent, register the OphyAI MCP server alongside your other tools (web search, calendar, email). The agent now has interview superpowers as a side effect.
Example prompt
"ophyai.interview_answer({ question, resume_text, job_description, tone: 'star' })"
Once v0.2 ships MCP resources, your agent can fetch ophyai://resumes/{id} directly — no more pasting plaintext. The agent receives the full resume body as a first-class MCP resource.
Example prompt
"Read ophyai://resumes/primary, then call ophyai.resume_analyze with that text and the JD I paste next."
Wire the MCP server into your shell agent of choice (Claude Code, Codename, your own). Search jobs, score listings against your resume, and write a tracker entry — all from the terminal.
Example prompt
"Use ophyai.jobs_search for senior backend roles, then ophyai.resume_analyze to score my resume against the top 5. Write a markdown report to ./jobs/today.md."
Free, no credit card. Get an account at app.ophyai.com — the MCP server uses your account's API key.
Once published, install via npx: `npx -y @ophyai/mcp-server@latest`. Today, you can clone the OpenAPI-driven reference implementation listed in /integrations/mcp-server.md.
In claude_desktop_config.json, .mcp.json, or your client's equivalent, add an entry that runs the server with OPHYAI_API_KEY set to your OphyAI key.
Restart Claude Desktop / Claude Code / Cursor. The OphyAI MCP tools (interview_answer, resume_analyze, etc.) should appear in the tool list.
Ask: "Use the ophyai.resume_analyze tool to score this resume against this JD." If the agent calls the tool and returns a score, you're wired up.
Compose with other MCP servers — filesystem, git, web search. The OphyAI tools become a normal part of your agent's capability surface.
The agent has access to these OphyAI API endpoints. Full OpenAPI spec at /integrations/openapi.yaml.
| Action | Purpose |
|---|---|
| interview_answer | Real-time interview answer from question + resume/JD context. |
| star_answer | STAR-format behavioral story for a target role. |
| resume_analyze | ATS score + missing keywords + prioritized fixes. |
| resume_tailor | Rewrite a resume for a specific JD (preserve_truth on by default). |
| cover_letter_generate | Tailored cover letter generation. |
| jobs_search | Search the OphyAI job index by query, location, seniority, recency. |
Free with any OphyAI account, including the no-card free tier. Programmatic API usage is metered against your OphyAI credits — Resume Builder calls cost 5 credits, Interview Copilot answers cost 15. The MCP server itself is open and self-hostable; no per-seat fee.
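Using the per-call costs above (5 credits for Resume Builder calls, 15 for Interview Copilot answers), a quick sketch of what a batch workflow consumes. The helper and the tool-to-cost mapping are illustrative assumptions; only the two per-call prices come from the metering described here.

```python
# Hypothetical helper: estimate credit spend for a batch of MCP tool calls.
# Only the per-call costs (5 and 15) come from OphyAI's published metering;
# which tools map to which price is an illustrative assumption.
CREDIT_COSTS = {
    "resume_analyze": 5,      # Resume Builder calls: 5 credits
    "resume_tailor": 5,
    "interview_answer": 15,   # Interview Copilot answers: 15 credits
}

def estimate_credits(calls: dict[str, int]) -> int:
    """Total credits for a mapping of tool name -> call count."""
    return sum(CREDIT_COSTS.get(tool, 0) * n for tool, n in calls.items())

# Scoring a resume against 5 JDs plus 2 live interview answers:
print(estimate_credits({"resume_analyze": 5, "interview_answer": 2}))  # 55
```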
Copy/paste-ready example.
// claude_desktop_config.json (or any MCP client config)
{
"mcpServers": {
"ophyai": {
"command": "npx",
"args": ["-y", "@ophyai/mcp-server@latest"],
"env": {
"OPHYAI_API_KEY": "ophy_sk_live_••••"
}
}
}
}
// Once installed and the client is restarted, the agent can call:
//
// ophyai.interview_answer({
// question: "Tell me about a time you led through ambiguity.",
// resume_text: "...",
// job_description: "...",
// tone: "star"
// })
//
// ophyai.resume_analyze({ resume_text, job_description })
// ophyai.jobs_search({ query: "senior PM", country: "us", remote: true })
//
// Full tool inventory: /integrations/mcp-server.md
// OpenAPI spec: /integrations/openapi.yaml

The design doc and tool inventory at /integrations/mcp-server.md are real and stable enough to wire into your own MCP server today. The official @ophyai/mcp-server npm package is on the v0.1 roadmap. In the meantime, builders can consume the OpenAPI spec at /integrations/openapi.yaml directly and front it with their own MCP transport.
MCP is an open protocol from Anthropic for connecting LLM agents to tools and data sources. Any MCP-compatible client — Claude Desktop, Claude Code, Cursor, custom agents — can call MCP tools the same way, without bespoke integration code per server. Think of it as USB-C for AI agents.
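On the wire, an MCP tool call is a JSON-RPC 2.0 request with method `tools/call`, per the MCP specification. A client invoking the OphyAI interview tool would send something like the following (the tool and argument names follow the action table above; the `id` is arbitrary):

```json
{
  "jsonrpc": "2.0",
  "id": 7,
  "method": "tools/call",
  "params": {
    "name": "interview_answer",
    "arguments": {
      "question": "Tell me about a time you led through ambiguity.",
      "tone": "star"
    }
  }
}
```

The server replies with a JSON-RPC result carrying the tool's output, which is why any MCP-compatible client can call these tools without bespoke integration code.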
Anything that speaks MCP. The first targets are Claude Desktop, Claude Code, Cursor, Continue, and custom Anthropic / OpenAI agent stacks. The v0.1 server will ship as stdio (for desktop installs); v1 adds streamable HTTP transport at mcp.ophyai.com for hosted agents.
Same six capabilities (interview answers, STAR stories, resume analysis, resume tailoring, cover letters, job search), three different surfaces. Use MCP when you're building a custom agent or live inside a Claude/Cursor-style developer tool. Use the ChatGPT GPT inside ChatGPT. Use the Microsoft 365 Copilot agent inside Teams/Word/Outlook.
Yes — the v0.2 roadmap adds MCP resources for OphyAI-stored artifacts: ophyai://resumes/{id}, ophyai://applications/{id}, and ophyai://jobs/{id}. That lets agents fetch the actual resume / JD body without round-tripping through the user.
Yes. The OpenAPI spec at /integrations/openapi.yaml gives you exact request/response shapes for every tool. Write a thin MCP server in Python or TypeScript that maps each MCP tool call to the corresponding HTTPS endpoint. We'll happily review and link to community implementations — email api@ophyai.com.
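A minimal sketch of that mapping layer in Python, using only the standard library. The base URL, endpoint paths, and header names below are assumptions for illustration; the real request/response shapes live in /integrations/openapi.yaml. A full bridge would wrap `call_tool` in an MCP SDK's stdio tool handlers.

```python
# Sketch of the HTTP-mapping half of a thin MCP bridge.
# Base URL and endpoint paths are assumptions; consult the OpenAPI spec.
import json
import urllib.request

BASE_URL = "https://api.ophyai.com"  # hypothetical base URL

# Assumed tool-name -> endpoint mapping; the real paths are in the OpenAPI spec.
ENDPOINTS = {
    "interview_answer": "/v1/interview/answer",
    "resume_analyze": "/v1/resume/analyze",
    "jobs_search": "/v1/jobs/search",
}

def build_request(tool: str, arguments: dict, api_key: str) -> urllib.request.Request:
    """Map one MCP tools/call invocation to an HTTPS request."""
    if tool not in ENDPOINTS:
        raise ValueError(f"unknown tool: {tool}")
    return urllib.request.Request(
        BASE_URL + ENDPOINTS[tool],
        data=json.dumps(arguments).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

def call_tool(tool: str, arguments: dict, api_key: str) -> dict:
    """Execute the call and hand the JSON body back to the MCP layer."""
    with urllib.request.urlopen(build_request(tool, arguments, api_key)) as resp:
        return json.load(resp)
```

The split between `build_request` and `call_tool` keeps the endpoint mapping testable without network access, which makes it easy to validate against the OpenAPI spec as it evolves.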
Sign up free, then read /integrations/mcp-server.md for the full tool inventory and roadmap. The v0.1 npm package is coming soon; the OpenAPI spec is consumable today.