rlm-mcp: The Recursive Project Brain

rlm-mcp is an advanced Model Context Protocol (MCP) server that turns your local repository into a "Shared Project Brain." It uses recursive reasoning (via RLM) and local LLMs (via Ollama) to analyze, reason about, and remember your project's architecture, helping teams maintain a deep understanding of complex codebases such as Grails Core.

Features

  • Frictionless Auto-Pilot: Automatically installs Ollama, manages Python environments (uv), and pulls optimized reasoning models.
  • Persistent Knowledge: Distills recursive reasoning into portable YAML knowledge bases that can be checked into Git.
  • Workspace-Aware: Automatically discovers project configurations in .mcp/ or .rlm/ folders.
  • Recursive Reasoning: Uses multi-step LLM loops to trace complex class hierarchies and architectural patterns.

Installation

For Developers (via Cargo)

If you have Rust installed:

cargo install --git https://github.com/borinquenkid/rlm-mcp

For Users (Pre-compiled Binaries)

  1. Download the latest binary for your OS from the Releases page.
  2. Rename the file to rlm-mcp and move it to a directory on your PATH, such as /usr/local/bin.
  3. Ensure the binary is executable: chmod +x rlm-mcp.
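The steps above can be sketched as a short shell session. The commented-out curl URL is an assumption (the actual asset names live on the Releases page), and a placeholder file stands in for the downloaded binary so the snippet runs without network access:

```shell
# Hedged sketch of steps 1-3. The release asset name below is an assumption;
# check the Releases page for the file matching your OS and architecture:
#
#   curl -L -o rlm-mcp \
#     "https://github.com/borinquenkid/rlm-mcp/releases/latest/download/<asset-for-your-os>"
#
# The rename/permission steps, simulated on a placeholder file:
cd "$(mktemp -d)"
printf '#!/bin/sh\necho placeholder\n' > rlm-mcp   # stands in for the download
chmod +x rlm-mcp                                   # step 3: make it executable
test -x rlm-mcp && echo "ok: now move rlm-mcp to /usr/local/bin (or equivalent)"
```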

Note: The first time you run rlm-mcp, it will automatically provision your local Python environment, pull the necessary reasoning models (approx. 5GB), and configure the background services.

Project Structure

  • .mcp/: Configuration and project-specific knowledge base.
  • knowledge_base/: Distilled "permanent facts" about your project (version-controlled).
  • trajectories/: Raw logs of every "thinking" session (ignored by Git).
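Since knowledge_base/ is meant to be checked into Git while trajectories/ is not, a matching .gitignore entry could look like the fragment below (a sketch — adjust the path to wherever the trajectories folder actually lives in your checkout):

```
# Keep distilled knowledge under version control; ignore raw thinking logs
trajectories/
```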

Multi-Agent Configuration

rlm-mcp is designed to orchestrate multiple specialized AI tools. You can "inject" sub-MCP servers into the Master Brain by adding them to .mcp/rlm_config.json:

{
  "sub_servers": {
    "git": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-git", "--repository", "."]
    }
  }
}

Once defined, rlm-mcp will automatically discover these tools, making them available to your recursive reasoning engine (e.g., mcp.git.get_diff()).
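As a rough illustration of what discovery means here (a sketch only — rlm-mcp's actual discovery logic is internal, and the mcp.<name>.* namespacing is inferred from the example above), each key under sub_servers maps to a callable namespace:

```shell
# Write the example config, then derive the tool namespaces from its keys.
# Requires python3 for JSON parsing; the namespace scheme is an assumption.
mkdir -p .mcp-demo && cd .mcp-demo
cat > rlm_config.json <<'EOF'
{
  "sub_servers": {
    "git": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-git", "--repository", "."]
    }
  }
}
EOF
python3 -c 'import json
cfg = json.load(open("rlm_config.json"))
for name in cfg["sub_servers"]:
    print("mcp." + name)'   # → mcp.git
```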

Usage with MCP Clients

To use rlm-mcp with an MCP client (such as Claude Desktop), add this to your MCP configuration:

{
  "mcpServers": {
    "rlm-mcp": {
      "command": "/path/to/rlm-mcp"
    }
  }
}

rlm-mcp will then auto-provision the local Ollama backend and Python environment on its first launch.
