A few days ago I shipped receiptconverter-mcp — an MCP server that wraps the ReceiptConverter API so Claude, Cursor, and Windsurf can call it as a native tool.
The total code: about 150 lines. The total time: one afternoon. Here's exactly how I built it.
What MCP actually is
Before I get into the build, a quick explanation for anyone who hasn't encountered it yet.
Model Context Protocol is an open standard by Anthropic that lets AI clients (Claude, Cursor, Windsurf, etc.) call tools from external servers. The server defines tools with names and JSON Schema parameters. The client discovers and calls them over a stdio pipe.
From the developer's perspective, you're essentially writing a small Node.js (or Python) process that:
- Declares a list of tools (name, description, input schema)
- Handles tool call requests
- Returns results
That's it. The AI client handles everything else.
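To make that concrete, wiring a published server into a client like Claude Desktop is a few lines of JSON in its claude_desktop_config.json. This is a sketch; the env var name is my assumption about how a server like this would take its API key:

```json
{
  "mcpServers": {
    "receiptconverter": {
      "command": "npx",
      "args": ["-y", "receiptconverter-mcp"],
      "env": { "RECEIPTCONVERTER_API_KEY": "<your-api-key>" }
    }
  }
}
```

The client launches the command, speaks the protocol over the process's stdin/stdout, and the tools show up in its tool panel.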
The implementation
I started with the @modelcontextprotocol/sdk package. The SDK handles the wire protocol — you just implement two request handlers.
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { ListToolsRequestSchema, CallToolRequestSchema } from "@modelcontextprotocol/sdk/types.js";
const server = new Server(
{ name: "receiptconverter", version: "1.0.0" },
{ capabilities: { tools: {} } }
);
// 1. Declare your tools
server.setRequestHandler(ListToolsRequestSchema, async () => ({
tools: [{
name: "convert_receipt",
description: "Parse a receipt or invoice into structured JSON...",
inputSchema: {
type: "object",
properties: {
url: { type: "string", description: "Public URL of receipt" },
file_path: { type: "string", description: "Absolute local file path" },
},
},
}],
}));
// 2. Handle tool calls
server.setRequestHandler(CallToolRequestSchema, async ({ params }) => {
const { name, arguments: args } = params;
// ... call your API, return results
});
// 3. Connect over stdio
await server.connect(new StdioServerTransport());
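The handler body elided above looks roughly like this when fleshed out. I've pulled it into a plain function here; the endpoint URL, auth header, and env var name are placeholders, not the real ReceiptConverter API:

```javascript
// Sketch of the convert_receipt handler body (URL mode only).
// The endpoint and env var name below are assumptions, not the real API.
async function handleConvertReceipt(args, fetchImpl = fetch) {
  const res = await fetchImpl("https://api.example.com/v1/convert", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.RECEIPTCONVERTER_API_KEY ?? ""}`,
    },
    body: JSON.stringify({ url: args.url }),
  });
  if (!res.ok) {
    // Report failure as a tool error instead of throwing and killing the server
    return {
      content: [{ type: "text", text: `API error: ${res.status}` }],
      isError: true,
    };
  }
  const data = await res.json();
  // MCP tool results are an array of content blocks
  return {
    content: [{ type: "text", text: JSON.stringify(data, null, 2) }],
  };
}
```

Returning `isError: true` rather than throwing keeps the process alive and gives the model something it can read and react to.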
The tricky part was supporting both URL mode and local file upload. For URL mode, I send JSON. For file paths, I read the file with fs.readFileSync, wrap it in a File object, and send it as FormData. FormData has been a Node global since 18, and File is importable from node:buffer there (it became a global in Node 20), so no extra dependencies are needed.
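That branching looks roughly like this. The "url" and "file" field names are assumptions about the API's contract:

```javascript
import fs from "node:fs";
import path from "node:path";
import { File } from "node:buffer"; // a global since Node 20; importable on 18

// Build the outgoing request for either input mode.
function buildRequestBody(args) {
  if (args.url) {
    // URL mode: plain JSON body
    return {
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ url: args.url }),
    };
  }
  // File mode: read the bytes and wrap them for a multipart upload
  const bytes = fs.readFileSync(args.file_path);
  const form = new FormData();
  form.append("file", new File([bytes], path.basename(args.file_path)));
  // No Content-Type header here: fetch adds the multipart boundary itself
  return { headers: {}, body: form };
}
```

Note the deliberately empty headers in file mode; setting Content-Type by hand would clobber the multipart boundary that fetch generates.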
What I decided not to include
I kept it to two tools: convert_receipt and check_usage. I deliberately didn't add:
- Listing saved receipts — that requires dashboard auth, not just an API key
- Webhooks or streaming — the API is synchronous, keeping it simple
- Caching — the MCP server is stateless by design
The goal was a package so simple anyone could audit it in 5 minutes.
Publishing to npm
cd mcp/
npm publish --access public
One gotcha: if your npm account has 2FA enabled, not every token can publish from a script. You need a Classic Automation token, which is designed to bypass the 2FA check; regular auth tokens and even granular tokens won't work unless you explicitly enable "Bypass 2FA" when creating them.
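In practice that token ends up in an .npmrc the publish step can read. A sketch, where NPM_TOKEN is whatever your CI calls the secret:

```ini
//registry.npmjs.org/:_authToken=${NPM_TOKEN}
```

Keep this out of version control if you ever paste the raw token instead of an env var reference.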
The package.json setup
A few things that matter for a publishable MCP server:
{
"type": "module",
"bin": { "receiptconverter-mcp": "./index.js" },
"files": ["index.js", "README.md"],
"engines": { "node": ">=18" }
}
"type": "module" — necessary for ES module imports with the SDK.
"bin" — this is what makes npx receiptconverter-mcp work. Without it, npx wouldn't know what to run.
"files" — keeps the package small. Node modules (node_modules/) are not included because users run via npx, which installs dependencies fresh.
Distribution
Once published to npm, you can submit to MCP directories:
- mcpmarket.com — submit your GitHub repo URL
- glama.ai/mcp/servers — growing fast
- aiagentslist.com — broader AI tools directory
These directories list MCP servers the same way npm lists packages. You show up alongside Stripe, GitHub, Notion — tools developers already use in their AI setups.
What I'd do differently
Write a test first. I ended up testing by echo-piping hand-written MCP messages into the server, which works but is tedious. A proper test harness using the SDK's in-memory transport would have been faster.
Add a --version flag. Some clients display version info in their MCP panels. Easy to add, I just forgot.
Create the GitHub repo first. I built in a subdirectory of my main private repo, then had to extract it into a separate public repo for the MCP directories. Starting with a public repo from the beginning would have saved a step.
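For what it's worth, the echo-piped manual testing can at least be scripted. This emits the JSON-RPC handshake a client would send, one message per line, ready to pipe into the server's stdin (message shapes follow the MCP spec; the protocol version string may need updating for newer revisions):

```javascript
// Print an initialize handshake plus a tools/list request.
const messages = [
  { jsonrpc: "2.0", id: 1, method: "initialize",
    params: { protocolVersion: "2024-11-05", capabilities: {},
              clientInfo: { name: "manual-test", version: "0.0.0" } } },
  { jsonrpc: "2.0", method: "notifications/initialized" },
  { jsonrpc: "2.0", id: 2, method: "tools/list", params: {} },
];
for (const m of messages) console.log(JSON.stringify(m));
```

Usage: node handshake.mjs | npx receiptconverter-mcp — the tools/list response should echo back the declared tool definitions.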
The full code is at github.com/cheatbased/receiptconverter-mcp. If you're building a SaaS API and wondering whether to ship an MCP server — it's worth the afternoon.
Related: MCP Quick Start · MCP / AI Agents guide · Why every SaaS should ship an MCP server