
OphyAI MCP Server — Model Context Protocol

For AI agents that build, prep, and apply

Expose six OphyAI tools to any MCP-compatible agent: Claude Desktop, Claude Code, Cursor, or your own LLM stack. Real-time interview answers, ATS resume scoring, JD-tailored resume rewriting, STAR behavioral stories, cover-letter generation, and job search — installed once, available everywhere your agent runs.

TL;DR — The Quick Answer

OphyAI MCP Server exposes six interview, resume, and job-search tools to any Model Context Protocol–compatible agent (Claude Desktop, Claude Code, Cursor, custom stacks). The design doc at /integrations/mcp-server.md and the OpenAPI spec at /integrations/openapi.yaml are real today; the official @ophyai/mcp-server npm package is on the v0.1 roadmap. Builders can implement against the OpenAPI immediately. Free with any OphyAI account; transport is stdio at v0.1 and streamable HTTP at v1.

What you can do

Real prompts you can paste in today. The agent calls OphyAI's APIs in the background.

Interview prep inside Claude Desktop

Wire the OphyAI MCP server into Claude Desktop, then ask Claude for STAR stories, ATS scores, or live interview answers — Claude calls OphyAI tools the same way it calls filesystem or git tools.

Example prompt

"Use the OphyAI tools to give me 5 STAR stories grounded in my resume at ~/Documents/resume.md, scoped to the JD I'll paste next."

Resume tailoring inside Claude Code

In a Claude Code session, ask the agent to tailor a resume Markdown file in your repo against a JD — and write the result back to disk. The MCP resume_tailor tool keeps the truth-preserving flag on by default.

Example prompt

"Use ophyai.resume_tailor to tailor ./me/resume.md against ./me/jds/acme-senior-be.md and write the result to ./me/resume.acme.md. Don't fabricate experience."

Behavioral story library in Cursor

Use the MCP server inside Cursor to keep a Markdown story library up to date. Each new role you describe generates a STAR-format entry the agent can pull from later.

Example prompt

"For each role in my career-history.md, generate a STAR story for the competency 'leading through ambiguity' using ophyai.star_answer. Append to ./me/stories/leadership.md."

Build your own interview agent

Inside a custom LangChain or Vercel AI SDK agent, register the OphyAI MCP server alongside your other tools (web search, calendar, email). The agent now has interview superpowers as a side effect.

Example call

"ophyai.interview_answer({ question, resume_text, job_description, tone: 'star' })"

Hand the agent the actual resume

Once v0.2 ships MCP resources, your agent can fetch ophyai://resumes/{id} directly — no more pasting plaintext. The agent receives the full resume body as a first-class MCP resource.

Example prompt

"Read ophyai://resumes/primary, then call ophyai.resume_analyze with that text and the JD I paste next."

Job hunting from the terminal

Wire the MCP server into your shell agent of choice (Claude Code, Codex, or your own). Search jobs, score listings against your resume, and write a tracker entry — all from the terminal.

Example prompt

"Use ophyai.jobs_search for senior backend roles, then ophyai.resume_analyze to score my resume against the top 5. Write a markdown report to ./jobs/today.md."

Setup — wire OphyAI into your MCP client

  1. Sign up for OphyAI

    Free, no credit card. Get an account at app.ophyai.com — the MCP server uses your account's API key.

  2. Install the MCP server (v0.1, coming soon)

    Once published, install via npx: `npx -y @ophyai/mcp-server@latest`. Today, you can clone the OpenAPI-driven reference implementation listed in /integrations/mcp-server.md.

  3. Add to your MCP client config

    In claude_desktop_config.json, .mcp.json, or your client's equivalent, add an entry that runs the server with OPHYAI_API_KEY set to your OphyAI key.

  4. Restart your client

    Restart Claude Desktop / Claude Code / Cursor. The OphyAI MCP tools (interview_answer, resume_analyze, etc.) should appear in the tool list.

  5. Test with a simple call

    Ask: "Use the ophyai.resume_analyze tool to score this resume against this JD." If the agent calls the tool and returns a score, you're wired up.

  6. Build with it

    Compose with other MCP servers — filesystem, git, web search. The OphyAI tools become a normal part of your agent's capability surface.
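As a sketch, composing OphyAI with Anthropic's reference filesystem server in claude_desktop_config.json might look like this. The @ophyai/mcp-server package name is the planned v0.1 name (not yet published), and the allowed-directory path is illustrative:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/me/Documents"]
    },
    "ophyai": {
      "command": "npx",
      "args": ["-y", "@ophyai/mcp-server@latest"],
      "env": {
        "OPHYAI_API_KEY": "ophy_sk_live_••••"
      }
    }
  }
}
```

With both servers registered, a single prompt like the resume-tailoring example above can read a file via the filesystem tools, pass its text to an OphyAI tool, and write the result back.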

Available actions

The agent has access to these OphyAI API endpoints. Full OpenAPI spec at /integrations/openapi.yaml.

Action                   Purpose
interview_answer         Real-time interview answer from question + resume/JD context.
star_answer              STAR-format behavioral story for a target role.
resume_analyze           ATS score + missing keywords + prioritized fixes.
resume_tailor            Rewrite a resume for a specific JD (preserve_truth on by default).
cover_letter_generate    Tailored cover letter generation.
jobs_search              Search the OphyAI job index by query, location, seniority, recency.
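Inferred from the example prompts on this page, the argument shapes for three of the tools can be sketched in TypeScript. Field names here are assumptions drawn from the examples, not confirmed shapes — check /integrations/openapi.yaml before building on them:

```typescript
// Illustrative argument shapes only -- confirm every field against
// /integrations/openapi.yaml before relying on them.
interface InterviewAnswerArgs {
  question: string;
  resume_text: string;
  job_description: string;
  tone?: string; // "star" appears in this page's examples
}

interface ResumeTailorArgs {
  resume_text: string;
  job_description: string;
  preserve_truth?: boolean; // documented as on by default
}

interface JobsSearchArgs {
  query: string;
  country?: string; // e.g. "us"
  remote?: boolean;
}

// Tiny helper showing how a wrapper might log a tool invocation.
function describeCall(tool: string, args: object): string {
  return `${tool}(${JSON.stringify(args)})`;
}
```

Typed shapes like these make it easy to validate agent-supplied arguments before they ever reach the API.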

Pricing & limits

Free with any OphyAI account, including the no-card free tier. Programmatic API usage is metered against your OphyAI credits — Resume Builder calls cost 5 credits, Interview Copilot answers cost 15. The MCP server itself is open and self-hostable; no per-seat fee.

Example: install + call from an MCP client

Copy/paste-ready example.

// claude_desktop_config.json (or any MCP client config)

{
  "mcpServers": {
    "ophyai": {
      "command": "npx",
      "args": ["-y", "@ophyai/mcp-server@latest"],
      "env": {
        "OPHYAI_API_KEY": "ophy_sk_live_••••"
      }
    }
  }
}

// Once installed and the client is restarted, the agent can call:
//
//   ophyai.interview_answer({
//     question: "Tell me about a time you led through ambiguity.",
//     resume_text: "...",
//     job_description: "...",
//     tone: "star"
//   })
//
//   ophyai.resume_analyze({ resume_text, job_description })
//   ophyai.jobs_search({ query: "senior PM", country: "us", remote: true })
//
// Full tool inventory: /integrations/mcp-server.md
// OpenAPI spec: /integrations/openapi.yaml

Frequently Asked Questions

Is the OphyAI MCP server published yet?

The design doc and tool inventory at /integrations/mcp-server.md are real and stable enough to wire into your own MCP server today. The official @ophyai/mcp-server npm package is on the v0.1 roadmap. In the meantime, builders consume the OpenAPI spec at /integrations/openapi.yaml directly and front it with their own MCP transport.

What is Model Context Protocol?

MCP is an open protocol from Anthropic for connecting LLM agents to tools and data sources. Any MCP-compatible client — Claude Desktop, Claude Code, Cursor, custom agents — can call MCP tools the same way, without bespoke integration code per server. Think of it as USB-C for AI agents.

Which MCP clients will work with the OphyAI server?

Anything that speaks MCP. The first targets are Claude Desktop, Claude Code, Cursor, Continue, and custom Anthropic / OpenAI agent stacks. The v0.1 server will ship as stdio (for desktop installs); v1 adds streamable HTTP transport at mcp.ophyai.com for hosted agents.
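Once the v1 streamable HTTP transport ships, a remote entry in a client config such as .mcp.json might look like the sketch below. The hostname comes from the roadmap above, but the /mcp path, the "type" key, and the header-based auth are assumptions — exact config keys vary by client:

```json
{
  "mcpServers": {
    "ophyai": {
      "type": "http",
      "url": "https://mcp.ophyai.com/mcp",
      "headers": {
        "Authorization": "Bearer ophy_sk_live_••••"
      }
    }
  }
}
```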

How does the MCP server differ from the ChatGPT GPT and the Microsoft Copilot agent?

Same six capabilities (interview answers, STAR stories, resume analysis, resume tailoring, cover letters, job search), three different surfaces. Use MCP when you're building a custom agent or live inside a Claude/Cursor-style developer tool. Use the ChatGPT GPT inside ChatGPT. Use the Microsoft 365 Copilot agent inside Teams/Word/Outlook.

Will the server expose MCP resources, not just tools?

Yes — the v0.2 roadmap adds MCP resources for OphyAI-stored artifacts: ophyai://resumes/{id}, ophyai://applications/{id}, and ophyai://jobs/{id}. That lets agents fetch the actual resume / JD body without round-tripping through the user.

Can I implement the OphyAI MCP server myself today?

Yes. The OpenAPI spec at /integrations/openapi.yaml gives you exact request/response shapes for every tool. Write a thin MCP server in Python or TypeScript that maps each MCP tool call to the corresponding HTTPS endpoint. We'll happily review and link to community implementations — email api@ophyai.com.
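The "thin server" mapping layer can be sketched in TypeScript as a pure function from tool name + arguments to an HTTPS request. The base URL and endpoint paths below are placeholders, not the real API surface — take both from /integrations/openapi.yaml:

```typescript
// Minimal sketch: each MCP tool call becomes one HTTPS request.
// OPHYAI_BASE and the paths in TOOL_PATHS are assumptions for illustration;
// the authoritative shapes live in /integrations/openapi.yaml.
const OPHYAI_BASE = "https://api.ophyai.com";

const TOOL_PATHS: Record<string, string> = {
  interview_answer: "/v1/interview_answer",
  star_answer: "/v1/star_answer",
  resume_analyze: "/v1/resume_analyze",
  resume_tailor: "/v1/resume_tailor",
  cover_letter_generate: "/v1/cover_letter_generate",
  jobs_search: "/v1/jobs_search",
};

interface OphyRequest {
  url: string;
  method: "POST";
  headers: Record<string, string>;
  body: string;
}

// Translate one MCP tool invocation into the HTTP request to send.
function buildRequest(
  tool: string,
  args: Record<string, unknown>,
  apiKey: string
): OphyRequest {
  const path = TOOL_PATHS[tool];
  if (!path) throw new Error(`Unknown OphyAI tool: ${tool}`);
  return {
    url: `${OPHYAI_BASE}${path}`,
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(args),
  };
}
```

An actual server would register each tool name with an MCP SDK, run the parsed arguments through buildRequest, fetch the URL, and return the JSON body as the tool result.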

Build with OphyAI MCP

Sign up free, then read /integrations/mcp-server.md for the full tool inventory and roadmap. The v0.1 npm package is coming soon; the OpenAPI spec is consumable today.