invompt-mcp

MCP server for Invompt — the AI invoice generator.

Describe what you need, get a professional invoice. Connects your AI assistant to the Invompt API so you can create, format, and share invoices without leaving your editor.


What it does

Component                           Description
Resource  invompt://spec/iml/v1     Invoice format specification — the LLM reads this to learn how to structure invoices
Tool      create_invoice            Creates an invoice and returns a shareable URL
Prompt    draft_invoice_iml         Optional template that helps the LLM draft invoice data from natural language

Get your API key

  1. Sign up at invompt.com
  2. Go to Integrations → Generate API Key
  3. Copy the key (starts with inv_sk_) — it's shown once
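Since the key is shown only once, it's worth sanity-checking what you copied before wiring it into a config. A quick prefix check catches truncated or mispasted keys (the key below is a placeholder, not a real credential):

```shell
# Placeholder key for illustration only — substitute the key you copied
INVOMPT_API_KEY="inv_sk_example"

# Every Invompt API key starts with inv_sk_, so a prefix check
# catches copy/paste mistakes before the MCP server ever runs.
case "$INVOMPT_API_KEY" in
  inv_sk_*) echo "key format looks right" ;;
  *)        echo "unexpected key format" >&2; exit 1 ;;
esac
```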

Setup

Claude Code

claude mcp add invompt -e INVOMPT_API_KEY=inv_sk_... -- npx invompt-mcp

Claude Desktop

Edit ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows):

{
  "mcpServers": {
    "invompt": {
      "command": "npx",
      "args": ["-y", "invompt-mcp"],
      "env": {
        "INVOMPT_API_KEY": "inv_sk_..."
      }
    }
  }
}

Restart Claude Desktop after saving.

Cursor

Add to ~/.cursor/mcp.json (global) or .cursor/mcp.json (project):

{
  "mcpServers": {
    "invompt": {
      "command": "npx",
      "args": ["-y", "invompt-mcp"],
      "env": {
        "INVOMPT_API_KEY": "inv_sk_..."
      }
    }
  }
}

VS Code (Copilot)

Add to .vscode/mcp.json:

{
  "servers": {
    "invompt": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "invompt-mcp"],
      "env": {
        "INVOMPT_API_KEY": "inv_sk_..."
      }
    }
  }
}

VS Code uses "servers", not "mcpServers".

Windsurf

Add to ~/.codeium/windsurf/mcp_config.json:

{
  "mcpServers": {
    "invompt": {
      "command": "npx",
      "args": ["-y", "invompt-mcp"],
      "env": {
        "INVOMPT_API_KEY": "inv_sk_..."
      }
    }
  }
}

Cline

Open Cline sidebar → MCP Servers → Configure, then add:

{
  "mcpServers": {
    "invompt": {
      "command": "npx",
      "args": ["-y", "invompt-mcp"],
      "env": {
        "INVOMPT_API_KEY": "inv_sk_..."
      }
    }
  }
}

Amazon Q Developer

Add to ~/.aws/amazonq/mcp.json:

{
  "mcpServers": {
    "invompt": {
      "command": "npx",
      "args": ["-y", "invompt-mcp"],
      "env": {
        "INVOMPT_API_KEY": "inv_sk_..."
      }
    }
  }
}

Codex (OpenAI)

Add to ~/.codex/config.toml:

[mcp_servers.invompt]
command = "npx"
args = ["-y", "invompt-mcp"]

[mcp_servers.invompt.env]
INVOMPT_API_KEY = "inv_sk_..."

Bun users

Replace npx with bunx and remove -y in any configuration above:

"command": "bunx",
"args": ["invompt-mcp"],

Usage

Once connected, ask your AI assistant:

"Create an invoice for 40 hours of web development at $150/hr for Acme Corp, due in 30 days"

The assistant reads the format spec, generates the invoice data, calls the API, and returns a link to your invoice.

How it works

You describe the invoice in plain English
  → LLM reads invompt://spec/iml/v1 to learn the format
  → LLM generates invoice data as YAML
  → LLM calls create_invoice
  → Invompt validates, renders, and stores the invoice
  → You get a shareable URL
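As a sketch of step 3, the drafted invoice data for the usage example below might look roughly like this. Every field name here is an illustrative guess — the authoritative schema is the invompt://spec/iml/v1 resource the LLM reads first:

```yaml
# Hypothetical IML draft — consult invompt://spec/iml/v1 for the real field names
client:
  name: Acme Corp
currency: USD
due: net-30
items:
  - description: Web development
    quantity: 40        # hours
    unit_price: 150
```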

Templates

Pass templateId to choose a look:

Template      Description
professional  Clean business layout (default)
minimal       Simple, compact
modern        Contemporary design
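For example, to request the minimal look, templateId rides along with the rest of the invoice data. Its exact placement is an assumption here; the invompt://spec/iml/v1 resource is authoritative:

```yaml
# Assumed placement of templateId — see invompt://spec/iml/v1
templateId: minimal
client:
  name: Acme Corp
```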

Environment variables

Variable         Required            Description
INVOMPT_API_KEY  For create_invoice  Your API key from invompt.com/integrations

The resource and prompt work without an API key — only the tool needs one.

Development

npm install       # install dependencies
npm run build     # compile TypeScript
npm test          # run tests
npm run dev       # run from source

License

MIT — see LICENSE
