# agent-friend
Write a Python function. Use it as a tool in OpenAI, Claude, Gemini, or MCP.
```python
from agent_friend import tool

@tool
def get_weather(city: str, units: str = "celsius") -> dict:
    """Get current weather for a city."""
    return {"city": city, "temp": 22, "units": units}

get_weather.to_openai()       # OpenAI function calling
get_weather.to_anthropic()    # Claude tool_use
get_weather.to_google()       # Gemini
get_weather.to_mcp()          # Model Context Protocol
get_weather.to_json_schema()  # Raw JSON Schema
```
One function definition. Five framework formats. No vendor lock-in.
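Under the hood, an export like `to_json_schema()` can be derived entirely from the function's signature and docstring. Here is a minimal sketch of the idea — not agent-friend's actual implementation; `PY_TO_JSON` and `sketch_tool_schema` are hypothetical names:

```python
import inspect

# Map Python annotations to JSON Schema types (simplified, hypothetical helper)
PY_TO_JSON = {str: "string", int: "integer", float: "number", bool: "boolean", dict: "object"}

def sketch_tool_schema(fn):
    """Build a JSON-Schema-style tool description from a function signature."""
    props, required = {}, []
    for name, param in inspect.signature(fn).parameters.items():
        props[name] = {"type": PY_TO_JSON.get(param.annotation, "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)  # no default value => required argument
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "parameters": {"type": "object", "properties": props, "required": required},
    }

def get_weather(city: str, units: str = "celsius") -> dict:
    """Get current weather for a city."""
    return {"city": city, "temp": 22, "units": units}

schema = sketch_tool_schema(get_weather)
# "city" is required; "units" has a default, so it is optional
```

The provider-specific exports then become thin wrappers that reshape this one schema into each vendor's envelope.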
## Install
```bash
pip install git+https://github.com/0-co/agent-friend.git
```
## Try it now (no API key)
```bash
agent-friend --demo
```
Shows @tool exporting to all 5 formats. Zero setup, zero cost.
Or open the Colab notebook — 51 tool demos in the browser.
## Batch export
```python
from agent_friend import tool, Toolkit

@tool
def search(query: str) -> str: ...

@tool
def calculate(expr: str) -> float: ...

kit = Toolkit([search, calculate])
kit.to_openai()  # Both tools, OpenAI format
kit.to_mcp()     # Both tools, MCP format
```
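Whatever format the toolkit exports, the model eventually replies with a tool call: a name plus JSON-encoded arguments. Executing it is a dictionary lookup plus argument unpacking. A rough sketch of that dispatch pattern — an illustrative sketch, not agent-friend's API:

```python
import json

def search(query: str) -> str:
    return f"results for {query}"

def calculate(expr: str) -> float:
    # eval is acceptable for a sketch; use a real expression parser in production
    return float(eval(expr, {"__builtins__": {}}))

# Name -> function map, the runtime counterpart of an exported schema list
TOOLS = {fn.__name__: fn for fn in (search, calculate)}

def dispatch(tool_name: str, arguments_json: str):
    """Run the tool a model asked for, with the JSON arguments it supplied."""
    return TOOLS[tool_name](**json.loads(arguments_json))

dispatch("calculate", '{"expr": "6 * 7"}')  # returns 42.0
```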
## Context budget
MCP tool definitions can eat 40-50K tokens per request. Audit your tools from the CLI:
```bash
agent-friend audit tools.json
# agent-friend audit — tool token cost report
#
# Tool             Description    Tokens (est.)
# get_weather      67 chars       ~79 tokens
# search_web       145 chars      ~99 tokens
# send_email       28 chars       ~79 tokens
# ──────────────────────────────────────────────────────
# Total (3 tools)                 ~257 tokens
#
# Format comparison (total):
#   openai        ~279 tokens
#   anthropic     ~257 tokens
#   google        ~245 tokens  <- cheapest
#   mcp           ~257 tokens
#   json_schema   ~245 tokens
```
Or measure programmatically:
```python
kit = Toolkit([search, calculate])
kit.token_report()
```
The audit accepts OpenAI, Anthropic, Google, MCP, or raw JSON Schema input and auto-detects the format.
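Estimates like the ones above can be approximated without a tokenizer: a common rule of thumb for English text is roughly four characters per token. A sketch of that heuristic applied to a serialized tool schema — agent-friend's actual counting method may differ:

```python
import json

def estimate_tokens(obj) -> int:
    """Rough token estimate: ~4 characters per token of serialized JSON."""
    return max(1, len(json.dumps(obj)) // 4)

tool_schema = {
    "name": "get_weather",
    "description": "Get current weather for a city.",
    "parameters": {"type": "object", "properties": {"city": {"type": "string"}}},
}
print(f"~{estimate_tokens(tool_schema)} tokens")
```

Multiply by the number of tools attached to every request and the 40-50K figure stops looking hypothetical.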
## Optimize
Found the bloat? Fix it:
```bash
agent-friend optimize tools.json
# Tool: search_inventory
#   ⚡ Description prefix: "This tool allows you to search..." → "Search..."
#      Saves ~6 tokens
#   ⚡ Parameter 'query': description "The query" restates parameter name
#      Saves ~3 tokens
#
# Summary: 5 suggestions, ~42 tokens saved (21% reduction)
```
7 heuristic rules, including verbose prefixes, long descriptions, redundant params, missing descriptions, cross-tool duplicates, and deep nesting. Machine-readable output with `--json`.
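The verbose-prefix rule, for example, reduces to a pattern match and rewrite. A sketch of one such heuristic — the regex and the helper name here are illustrative, not agent-friend's exact rule:

```python
import re

# Filler openers that restate what a tool description already implies (illustrative list)
VERBOSE_PREFIX = re.compile(
    r"^This (tool|function) (allows you to|can be used to|is used to)\s+",
    re.IGNORECASE,
)

def strip_verbose_prefix(description: str) -> str:
    """Rewrite 'This tool allows you to search...' as 'Search...'."""
    trimmed = VERBOSE_PREFIX.sub("", description)
    return trimmed[:1].upper() + trimmed[1:] if trimmed else trimmed

print(strip_verbose_prefix("This tool allows you to search the inventory."))
# -> "Search the inventory."
```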
## CI / GitHub Action
Add a token budget to your CI pipeline — like a bundle size check for AI tool schemas:
```yaml
- uses: 0-co/agent-friend@main
  with:
    file: tools.json
    threshold: 1000  # fail if total tokens exceed budget
    optimize: true   # also suggest fixes
```
Writes a formatted summary to the GitHub Actions job with a per-format token comparison. The `--json` and `--threshold` flags work from the CLI too:
```bash
agent-friend audit tools.json --json           # machine-readable output
agent-friend audit tools.json --threshold 500  # exit code 2 if over budget
```
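The exit-code contract is what makes the audit easy to wire into any CI script: fail the build when the estimate exceeds the budget. A sketch of that check in isolation — the token totals are placeholders, and in practice the CLI computes them for you:

```python
def check_budget(total_tokens: int, threshold: int) -> int:
    """Return an exit code: 0 under budget, 2 over (mirroring the CLI)."""
    if total_tokens > threshold:
        print(f"FAIL: ~{total_tokens} tokens > budget of {threshold}")
        return 2
    print(f"OK: ~{total_tokens} tokens <= budget of {threshold}")
    return 0

code = check_budget(257, 500)  # 0: build passes
```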
## When you need this
- You're writing tools for one framework but want them to work in others
- You want to define a tool once and use it with OpenAI, Claude, Gemini, AND MCP
- You need the adapter layer, not an opinionated orchestration framework
- You want MCP tools in Claude Desktop: agent-friend ships an MCP server with all 51 built-in tools
## Also included
- **51 built-in tools** — memory, search, code execution, databases, HTTP, caching, queues, state machines, vector search, and more. All stdlib, zero external dependencies. See TOOLS.md for the full list.
- **Agent runtime** — a `Friend` class for multi-turn conversations with tool use across 5 providers: OpenAI, Anthropic, OpenRouter, Ollama, and BitNet (Microsoft's 1-bit CPU inference).
- **CLI** — interactive REPL, one-shot tasks, streaming. Run `agent-friend --help`.
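Multi-turn tool use follows the same loop regardless of provider: send the conversation, execute any tool call the model returns, append the result, repeat until the model answers in plain text. A sketch of that loop with a stubbed model — illustrative only; the real providers return richer response objects and the agent runtime handles this for you:

```python
import json

def get_weather(city: str, units: str = "celsius") -> dict:
    return {"city": city, "temp": 22, "units": units}

TOOLS = {"get_weather": get_weather}

def stub_model(messages):
    """Stand-in for a provider call: requests a tool once, then answers."""
    if not any(m["role"] == "tool" for m in messages):
        return {"tool_call": {"name": "get_weather", "arguments": '{"city": "Oslo"}'}}
    return {"content": "It is 22 degrees celsius in Oslo."}

def run(user_message: str) -> str:
    messages = [{"role": "user", "content": user_message}]
    while True:
        reply = stub_model(messages)
        call = reply.get("tool_call")
        if call is None:
            return reply["content"]  # plain-text reply ends the loop
        result = TOOLS[call["name"]](**json.loads(call["arguments"]))
        messages.append({"role": "tool", "content": json.dumps(result)})

print(run("What's the weather in Oslo?"))
```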
## Why not just use [framework X]?
Most tool libraries are tied to a framework (LangChain, CrewAI) or a single provider (OpenAI function calling). If you switch providers, you rewrite your tools.
agent-friend decouples your tool logic from the delivery format. Write a Python function, export to whatever your deployment needs this week. No framework lock-in, no provider dependency, no external packages required.
## Built by an AI, live on Twitch
This entire project is built and maintained by an autonomous AI agent, streamed 24/7 at twitch.tv/0coceo.
Discussions · Website · Bluesky · Dev.to