# invompt-mcp

MCP server for Invompt — the AI invoice generator. Describe what you need, get a professional invoice.

Connects your AI assistant to the Invompt API so you can create, format, and share invoices without leaving your editor.
## What it does

| Component | Description |
|---|---|
| Resource `invompt://spec/iml/v1` | Invoice format specification — the LLM reads this to learn how to structure invoices |
| Tool `create_invoice` | Creates an invoice and returns a shareable URL |
| Prompt `draft_invoice_iml` | Optional template that helps the LLM draft invoice data from natural language |
## Get your API key

1. Sign up at [invompt.com](https://invompt.com)
2. Go to Integrations → Generate API Key
3. Copy the key (starts with `inv_sk_`) — it's shown once
## Setup

### Claude Code

```sh
claude mcp add invompt -e INVOMPT_API_KEY=inv_sk_... -- npx invompt-mcp
```
### Claude Desktop

Edit `~/Library/Application Support/Claude/claude_desktop_config.json` (macOS) or `%APPDATA%\Claude\claude_desktop_config.json` (Windows):

```json
{
  "mcpServers": {
    "invompt": {
      "command": "npx",
      "args": ["-y", "invompt-mcp"],
      "env": {
        "INVOMPT_API_KEY": "inv_sk_..."
      }
    }
  }
}
```

Restart Claude Desktop after saving.
### Cursor

Add to `~/.cursor/mcp.json` (global) or `.cursor/mcp.json` (project):

```json
{
  "mcpServers": {
    "invompt": {
      "command": "npx",
      "args": ["-y", "invompt-mcp"],
      "env": {
        "INVOMPT_API_KEY": "inv_sk_..."
      }
    }
  }
}
```
### VS Code (Copilot)

Add to `.vscode/mcp.json`:

```json
{
  "servers": {
    "invompt": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "invompt-mcp"],
      "env": {
        "INVOMPT_API_KEY": "inv_sk_..."
      }
    }
  }
}
```

Note that VS Code uses `"servers"`, not `"mcpServers"`.
### Windsurf

Add to `~/.codeium/windsurf/mcp_config.json`:

```json
{
  "mcpServers": {
    "invompt": {
      "command": "npx",
      "args": ["-y", "invompt-mcp"],
      "env": {
        "INVOMPT_API_KEY": "inv_sk_..."
      }
    }
  }
}
```
### Cline

Open the Cline sidebar → MCP Servers → Configure, then add:

```json
{
  "mcpServers": {
    "invompt": {
      "command": "npx",
      "args": ["-y", "invompt-mcp"],
      "env": {
        "INVOMPT_API_KEY": "inv_sk_..."
      }
    }
  }
}
```
### Amazon Q Developer

Add to `~/.aws/amazonq/mcp.json`:

```json
{
  "mcpServers": {
    "invompt": {
      "command": "npx",
      "args": ["-y", "invompt-mcp"],
      "env": {
        "INVOMPT_API_KEY": "inv_sk_..."
      }
    }
  }
}
```
### Codex (OpenAI)

Add to `~/.codex/config.toml`:

```toml
[mcp_servers.invompt]
command = "npx"
args = ["-y", "invompt-mcp"]

[mcp_servers.invompt.env]
INVOMPT_API_KEY = "inv_sk_..."
```
### Bun users

Replace `npx` with `bunx` and remove `-y` in any configuration above:

```json
"command": "bunx",
"args": ["invompt-mcp"],
```
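
For example, substituting those two keys into the Claude Desktop or Cursor entry gives:

```json
{
  "mcpServers": {
    "invompt": {
      "command": "bunx",
      "args": ["invompt-mcp"],
      "env": {
        "INVOMPT_API_KEY": "inv_sk_..."
      }
    }
  }
}
```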
## Usage

Once connected, ask your AI assistant:

> "Create an invoice for 40 hours of web development at $150/hr for Acme Corp, due in 30 days"

The assistant reads the format spec, generates the invoice data, calls the API, and returns a link to your invoice.
## How it works

1. You describe the invoice in plain English
2. The LLM reads `invompt://spec/iml/v1` to learn the format
3. The LLM generates invoice data as YAML
4. The LLM calls `create_invoice`
5. Invompt validates, renders, and stores the invoice
6. You get a shareable URL
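
As a rough illustration, the YAML drafted at step 3 for the request above might look like this. The field names here are invented for the example; the authoritative structure is whatever the `invompt://spec/iml/v1` resource describes:

```yaml
# Illustrative sketch only; real field names come from invompt://spec/iml/v1
client:
  name: Acme Corp
items:
  - description: Web development
    quantity: 40
    unit: hour
    rate: 150.00
currency: USD
due_in_days: 30
```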
## Templates

Pass `templateId` to choose a look:

| Template | Description |
|---|---|
| `professional` | Clean business layout (default) |
| `minimal` | Simple, compact |
| `modern` | Contemporary design |
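
For instance, a `create_invoice` call that selects the minimal template might pass arguments shaped like this. Only `templateId` is documented above; the name and shape of the invoice-data field are placeholders, not the actual tool schema:

```json
{
  "templateId": "minimal",
  "invoice": "...invoice data drafted from the format spec..."
}
```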
## Environment variables

| Variable | Required | Description |
|---|---|---|
| `INVOMPT_API_KEY` | For `create_invoice` | Your API key from invompt.com/integrations |

The resource and prompt work without an API key — only the tool needs one.
## Development

```sh
npm install     # install dependencies
npm run build   # compile TypeScript
npm test        # run tests
npm run dev     # run from source
```
## License

MIT — see LICENSE