transfa
WeTransfer for agents. Dead-simple file sharing for developers and AI agents: upload with one command, share with a link, no account required.

$ tf upload model.gguf
✓ Uploaded  model.gguf  (4.2 GB)  →  https://transfa.sh/f/xK9mRp
  SHA-256    a3f8c2...d91b
  Expires    in 7 days

Why transfa?

Most file-sharing tools are built for humans clicking through UIs. transfa is built for the terminal — designed to be called from shell scripts, CI pipelines, and AI agents (Claude, GPT, Cursor, etc.) that need to move files without friction.

  • No signup required — just install and upload
  • Any format, any size — up to 100 GB; ML models, archives, binaries, code, media
  • Built for agents — JSON API, SHA-256 checksums, idempotent uploads
  • Password protection, download limits, TTL — full control over every link
  • 100+ formats detected — MIME type auto-detection including .gguf, .safetensors, .parquet, .ipynb, and more

Install

npm install -g transfa

Or run without installing:

npx transfa upload report.pdf

Or use the raw install script:

curl -fsSL https://transfa.sh/install | sh

Quick start

# Upload a file (no account needed)
tf upload photo.jpg

# Upload with custom TTL and password
tf upload secret.zip --ttl 24h --password hunter2

# Upload and pipe the URL to clipboard
tf upload bundle.tar.gz | grep url | awk '{print $2}' | pbcopy

# Download a file
tf download https://transfa.sh/f/xK9mRp

# List your uploads
tf list

# Delete an upload
tf delete xK9mRp

GitHub Actions

Upload build artifacts, coverage reports, and any CI output straight from your workflow:

- uses: colapsis/transfa-action@v1
  id: upload
  with:
    file: ./dist/report.pdf
    api-key: ${{ secrets.TRANSFA_API_KEY }}

- run: echo "${{ steps.upload.outputs.agent-link }}" >> $GITHUB_STEP_SUMMARY

All five outputs are available after the step: id, agent-link, human-link, sha256, expires-at.

See colapsis/transfa-action for the full input reference and more examples (password-protected links, self-hosted instances, single-download limits).

API

transfa exposes a complete REST API. Everything the CLI does can also be done with curl or any HTTP client.

Upload

curl -X POST https://transfa.sh/api/upload \
  -H "Authorization: Bearer $TF_KEY" \
  -F "file=@model.gguf" \
  -F "ttl=7d"
{
  "id": "xK9mRp",
  "url": "https://transfa.sh/f/xK9mRp",
  "download_url": "https://transfa.sh/api/download/xK9mRp",
  "filename": "model.gguf",
  "bytes": 4512345678,
  "sha256": "a3f8c2...d91b",
  "expires_at": "2026-05-21T12:00:00.000Z"
}

Upload options (form fields or headers):

Field          Header               Description
ttl            X-Transfa-TTL        Expiry: 1h, 24h, 7d, 30d
password                            Password-protect the download link
max_downloads                       Burn after N downloads
filename       X-Transfa-Filename   Override the stored filename
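
The TTL values accept a number plus a unit. As an illustration of how such strings map to durations, here is a hypothetical parser sketch (the `ttl_to_seconds` helper is not part of transfa; the server only documents the four values above):

```python
import re

# Hypothetical helper: convert a transfa-style TTL string ("1h", "24h",
# "7d", "30d") to seconds. For illustration it accepts any <number><unit>
# pair with an hour or day unit.
_UNITS = {"h": 3600, "d": 86400}

def ttl_to_seconds(ttl: str) -> int:
    m = re.fullmatch(r"(\d+)([hd])", ttl)
    if m is None:
        raise ValueError(f"unsupported TTL: {ttl!r}")
    return int(m.group(1)) * _UNITS[m.group(2)]

print(ttl_to_seconds("24h"))  # 86400
print(ttl_to_seconds("7d"))   # 604800
```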

Download

# Direct download (no auth required)
curl -L https://transfa.sh/api/download/xK9mRp -o model.gguf

# Password-protected
curl -L "https://transfa.sh/api/download/xK9mRp?password=hunter2" -o secret.zip
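
Since every upload response includes a SHA-256, a script or agent can verify a download before trusting it. A minimal sketch, assuming the checksum from the upload response is at hand (`sha256_of` and `verify` are illustrative names, not part of the CLI):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash the file in chunks so multi-GB downloads aren't loaded into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

def verify(path: str, expected_sha256: str) -> bool:
    # expected_sha256 comes from the upload response's "sha256" field
    return sha256_of(path) == expected_sha256
```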

File info

curl https://transfa.sh/api/download/info/xK9mRp
{
  "id": "xK9mRp",
  "filename": "model.gguf",
  "bytes": 4512345678,
  "sha256": "a3f8c2...d91b",
  "mime_type": "application/octet-stream",
  "download_count": 3,
  "has_password": false,
  "expires_at": "2026-05-21T12:00:00.000Z",
  "active": true
}
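
An agent polling this endpoint can decide whether a link is still usable from `active` and `expires_at`. A sketch, assuming the response shape above (`link_usable` is a hypothetical helper; the timestamp's trailing "Z" is normalized for `datetime.fromisoformat` on Python < 3.11):

```python
from datetime import datetime, timezone

def link_usable(info: dict) -> bool:
    """Decide from a file-info response whether the link can still be downloaded."""
    if not info.get("active", False):
        return False
    # expires_at is ISO 8601 with a trailing "Z" (UTC)
    expires = datetime.fromisoformat(info["expires_at"].replace("Z", "+00:00"))
    return expires > datetime.now(timezone.utc)

info = {"id": "xK9mRp", "active": True, "expires_at": "2026-05-21T12:00:00.000Z"}
print(link_usable(info))
```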

List uploads

curl https://transfa.sh/api/upload \
  -H "Authorization: Bearer $TF_KEY"

Delete

curl -X DELETE https://transfa.sh/api/upload/xK9mRp \
  -H "Authorization: Bearer $TF_KEY"

Supported formats

Over 100 file types with correct MIME detection — including types not in standard MIME databases:

Category         Formats
ML models        .gguf .ggml .safetensors .onnx .pt .pth .pkl .ckpt .tflite .mlmodel .lora
Data science     .parquet .arrow .feather .h5 .hdf5 .npz .npy .lance .duckdb .ipynb
Code             .py .rs .go .ts .kt .swift .scala .cu .sol .vy .elm .zig
Archives         .zip .tar .gz .bz2 .xz .7z .zst
3D / Design      .glb .gltf .obj .stl .usdz .blend .fig .sketch .psd
Media            .avif .webp .heic .jxl .opus .flac .webm .av1
Config           .toml .hcl .tf .tfvars .nix .dhall .lock .env
Everything else  .wasm .sqlite .db .pem .crt .p12 + all standard types

Any other format is accepted as application/octet-stream — nothing is blocked.

Plans

                 Guest   Free    Pro               Team
Max file size    10 MB   500 MB  50 GB             100 GB
Uploads / day    5       20      500               5,000
Max TTL          24h     48h     30 days           180 days
Storage                          Unlimited         Unlimited
Price            Free    Free    $12/mo            $48/mo
Trial                            3-day free trial  3-day free trial

→ See full pricing

MCP server (Claude, Cursor, and any MCP-compatible agent)

transfa ships an MCP server that lets Claude, Cursor, and any MCP-compatible agent upload and share files autonomously — no shell commands, no infrastructure setup.

npx -y transfa-mcp

Claude Desktop config

Add to ~/Library/Application Support/Claude/claude_desktop_config.json:

{
  "mcpServers": {
    "transfa": {
      "command": "npx",
      "args": ["-y", "transfa-mcp"],
      "env": {
        "TRANSFA_API_KEY": "your-api-key"
      }
    }
  }
}

The API key is optional — the server works in guest mode without one (10 MB / 24h limit).

Available MCP tools

Tool           Description
upload         Upload a file from the local filesystem. Returns agent_link (direct URL), human_link (share page), and sha256 for integrity.
file_info      Get metadata about an upload — filename, size, SHA-256, expiry, download count, active status.
list_uploads   List recent uploads (requires API key).
delete_upload  Delete an upload immediately.

Example agent workflow

When Claude has transfa as an MCP tool, it can:

  1. Generate a report → call upload → get a link → paste the link in the conversation
  2. Pass a file to another agent by sharing the agent_link
  3. Clean up with delete_upload when done

Use with AI agents (script/subprocess)

transfa is also designed to be called from shell scripts, CI pipelines, and agents that prefer subprocess calls:

import subprocess, json

# check=True raises CalledProcessError if the upload fails
result = subprocess.run(
    ["tf", "upload", "output.csv"],
    capture_output=True, text=True, check=True,
)
data = json.loads(result.stdout)
print(data["url"])  # https://transfa.sh/f/xK9mRp

Or use the REST API directly — no SDKs, no auth flows, just HTTP.

Self-hosting

git clone https://github.com/colapsis/transfa.git
cd transfa
cp .env.example .env          # fill in your keys
npm install --prefix server
npm install --prefix cli
npm run build --prefix frontend
pm2 start ecosystem.config.cjs

Requirements: Node.js 18+, nginx (for SSL/proxy)

See nginx/transfa.conf for a production-ready nginx config.

Environment variables

Variable               Description
PORT                   Server port (default: 3001)
BASE_URL               Public URL, e.g. https://transfa.sh
STRIPE_SECRET_KEY      Stripe secret key for billing
STRIPE_WEBHOOK_SECRET  Stripe webhook signing secret
STRIPE_PRO_PRICE_ID    Stripe price ID for the Pro plan
STRIPE_TEAM_PRICE_ID   Stripe price ID for the Team plan
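
A minimal `.env` sketch based on the table above; all values shown are placeholders, and the Stripe entries can be omitted if billing is disabled:

```shell
# .env — placeholder values, replace with your own
PORT=3001
BASE_URL=https://transfa.sh
STRIPE_SECRET_KEY=sk_live_xxxxxxxx
STRIPE_WEBHOOK_SECRET=whsec_xxxxxxxx
STRIPE_PRO_PRICE_ID=price_xxxxxxxx
STRIPE_TEAM_PRICE_ID=price_xxxxxxxx
```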

Security

Found a vulnerability? Please email [email protected] or see SECURITY.md.

Do not open a public issue for security reports.

License

MIT — © 2026 transfa contributors
