MiniMe-MCP v0.2.0

🚀 The IDE Agent upgrade that creates your developer digital twin across all your projects.

Everyone's doing it now: vibe coding with AI assistants. That magical flow where you're thinking out loud, the AI gets it, and code just happens. But here's the problem: your AI has amnesia. Every conversation starts from zero. Every project feels like explaining yourself to a stranger.

What if your AI could actually remember?

🧠 MiniMe-MCP: Your AI's Persistent Brain

Context Engineering for the Intelligence Age

This isn't just another MCP server. This is your AI assistant's digital hippocampus: storing, connecting, and evolving with every interaction. While others ship features, you'll be shipping intelligence.

The Problem with Vibe Coding Today:

  • 🔄 Endless repetition: "Here's how we handle auth..." (for the 50th time)
  • 🤷 Context amnesia: AI forgets your patterns, preferences, and decisions
  • 🏝️ Project islands: Learning from one project never benefits another
  • 📚 Knowledge leakage: Insights evaporate between sessions

The MiniMe-MCP Solution:

// Instead of this painful cycle:
"Hey AI, remember we use PostgreSQL with..."
"Oh, and we prefer functional components..."
"Also, we decided against Redis because..."

// You get this magical experience:
AI: "Based on your auth patterns from Project A and the scalability 
     lessons from Project B, here's how I'd approach this..."

🔮 How Intelligence-First Development Works

1. Memory That Actually Matters

Your AI doesn't just remember; it understands context:

  • Decisions & Rationale: Why you chose React over Vue (and when that changes)
  • Code Patterns: Your team's conventions that make review faster
  • Architecture Evolution: How your system design thinking has matured
  • Bug Solutions: That tricky CORS fix from 6 months ago

2. Cross-Project Pattern Recognition

The real magic happens when your AI connects dots across projects:

💡 "I notice you're implementing JWT auth again. In your last 3 projects, 
   you always hit the same refresh token edge case around day 3. 
   Want me to handle that proactively this time?"

💡 "Your database connection patterns from ProjectA would solve the 
   performance issue you're seeing here. Should I adapt that approach?"

💡 "Based on your deployment history, this looks like the same nginx 
   config issue that blocked ProjectC. Here's the fix that worked..."

3. Context Engineering in Action

Watch your AI assistant evolve from generic helper to project-native intelligence:

Session 1: Basic assistance
Session 10: Knows your preferences
Session 100: Predicts your needs
Session 1000: Thinks like your team

🎯 Real Benefits, Real Fast

For Solo Developers:

  • Instant Context Switching: Jump between projects without losing momentum
  • Personal Documentation: Your AI becomes your external brain
  • Pattern Evolution: Improve your architecture thinking over time

For Teams:

  • Shared Intelligence: New team members inherit collective wisdom
  • Consistent Patterns: AI enforces team conventions automatically
  • Decision History: Never wonder "why did we build it this way?"

For Organizations:

  • Cross-Team Learning: Best practices spread naturally
  • Knowledge Retention: Insights survive team changes
  • Intelligent Onboarding: New hires get context-aware assistance

๐Ÿ› ๏ธ Universal IDE Intelligence

Works seamlessly across the entire development ecosystem:

🎨 VS Code → Enhanced Copilot with persistent memory
🚀 Cursor → AI pair programming that actually remembers
⚡ Claude Desktop → Conversations that build on each other
🌊 Windsurf → Collaborative coding with shared context
🔗 Any MCP Client → Future-proof intelligence layer

⚡ The Intelligence Advantage

# Traditional AI Development
You: "How should I structure this API?"
AI: "Here are some general patterns..."
Result: Generic advice, repeated research

# Intelligence-First Development  
You: "How should I structure this API?"
AI: "Based on your 3 previous APIs, scaling issues you hit with 
     ServiceX, and the clean architecture you loved in ProjectY, 
     here's an approach that fits your patterns..."
Result: Personalized, battle-tested guidance

🚀 Ready to Upgrade Your Vibe?

Stop explaining yourself to your AI. Start building with an assistant that gets it.

MiniMe-MCP transforms every IDE session from:

  • โŒ "Let me explain our setup again..."
  • โœ… "You know what I'm trying to do. Let's build."

The future of development isn't just AI-assisted; it's intelligence-amplified. Your code. Your patterns. Your decisions. Remembered. Connected. Evolved.

Welcome to vibe coding with a brain. 🧠

Built for the Model Context Protocol. Compatible with VS Code, Cursor, Claude Desktop, Windsurf, and the expanding universe of AI-powered development tools.

✨ Why MiniMe-MCP?

  • Persistent Context: Your AI assistant remembers everything - decisions, code patterns, project knowledge
  • Intelligent Analysis: AI-powered insights that identify patterns and learning opportunities
  • Universal IDE Support: Works seamlessly with VS Code, Claude Desktop, Cursor, Windsurf
  • Privacy-First: Runs locally with your own Ollama models - your data never leaves your machine
  • Multi-Architecture: Native support for both Intel/AMD (x64) and Apple Silicon (ARM64)

🚀 Quick Start (5 minutes)

Prerequisites

  1. Install Docker - Get Docker

  2. Install Ollama - Required for AI models

    # macOS
    brew install ollama
    
    # Linux
    curl -fsSL https://ollama.ai/install.sh | sh
    
    # Windows
    # Download from https://ollama.ai/download
    
  3. Pull Required Models

    # Pull the embedding model (REQUIRED)
    ollama pull mxbai-embed-large
    
    # Pull the default LLM model
    ollama pull deepseek-coder:6.7b
    
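To confirm both models are available before starting the container, list what Ollama has pulled:

# Both models should appear in the output
ollama list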

Run MiniMe-MCP

# Pull and run the Docker image (auto-selects ARM64 or AMD64)
docker run -d \
  --name minimemcp \
  --restart unless-stopped \
  -p 5432:5432 \
  -p 8000:8000 \
  -p 9090:9090 \
  -v minime-mcp-v9:/data \
  -e POSTGRES_PASSWORD=minime_password \
  -e UI_PORT=9090 \
  manujbawa/minimemcp:latest

That's it! MiniMe-MCP is now running: the web UI is served on port 9090 and the MCP server on port 8000, with PostgreSQL exposed on 5432.
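
A quick sanity check right after startup (the same commands appear under Troubleshooting below):

# Confirm the container is up and the server responds
docker ps -f name=minimemcp
curl http://localhost:8000/health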

๐Ÿ› ๏ธ MCP Tools for Your IDE

Install the MCP Client

npm install -g @minimemcp/mcp-client

Available MCP Tools

  • store_memory - Intelligent memory storage with auto-tagging
  • search_memories - Hybrid semantic/keyword search
  • get_insights - AI-powered pattern analysis
  • start_thinking - Structured reasoning sequences
  • manage_tasks - Project task management
  • manage_project - Documentation and project management

🎯 IDE Integration

Configure your IDE to use MiniMe-MCP tools:

  • Claude Desktop - Full MCP support
  • Cursor - Full MCP support
  • VS Code - Supported with Copilot
  • Windsurf - Full MCP support

Once configured, your AI assistant will have access to persistent memory and intelligent tools directly in your IDE.
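
As a rough sketch, here is what wiring the client into an MCP-aware IDE can look like. The structure (an mcpServers entry with a command) is the convention Cursor and Claude Desktop use; the minime-mcp command name and config path below are assumptions, so check the @minimemcp/mcp-client docs for the exact values:

# Hypothetical Cursor setup (this overwrites ~/.cursor/mcp.json; merge by hand if you already have one)
# "minime-mcp" is an assumed binary name for @minimemcp/mcp-client
cat > ~/.cursor/mcp.json << 'EOF'
{
  "mcpServers": {
    "minime": {
      "command": "minime-mcp"
    }
  }
}
EOF

Claude Desktop uses the same mcpServers structure in its own claude_desktop_config.json.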

🧠 Key Features

Intelligence-First Framework

  • Mandatory Session Startup: Automatic project context loading
  • Aggressive Memory Storage: Everything important is stored automatically
  • Pattern Recognition: AI identifies trends and learning opportunities
  • Structured Thinking: Multi-step reasoning for complex problems

Advanced Capabilities

  • Unified Insights v2: Pattern detection with LLM-powered categorization
  • Sequential Thinking: Branch and explore multiple solution paths
  • Project Intelligence: Learns your codebase structure and conventions
  • Task Management: Integrated task tracking with intelligent prioritization

🔧 Advanced Configuration

Use Different LLM Models

docker run -d \
  --name minimemcp \
  -e LLM_MODEL="llama2:13b" \
  -e POSTGRES_PASSWORD=minime_password \
  -e UI_PORT=9090 \
  -p 5432:5432 -p 8000:8000 -p 9090:9090 \
  -v minime-mcp-v9:/data \
  manujbawa/minimemcp:latest
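
The LLM_MODEL value should name a model Ollama already has locally; if in doubt, pull it first:

# Make the alternate model available to Ollama
ollama pull llama2:13b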

Custom Ports

docker run -d \
  --name minimemcp \
  -e MCP_PORT="8080" \
  -e UI_PORT="9090" \
  -e POSTGRES_PASSWORD=minime_password \
  -p 5432:5432 -p 8080:8080 -p 9090:9090 \
  -v minime-mcp-v9:/data \
  manujbawa/minimemcp:latest

📦 Building from Source

For development or customization:

# Clone the repository
git clone https://github.com/yourusername/MiniMe-MCP
cd MiniMe-MCP

# Quick start with everything
make all

# Development mode with hot reload
make dev-hot

# Build for production
make build-fast-v2

๐Ÿ› Troubleshooting

Check Status

# Container status
docker ps -f name=minimemcp

# View logs
docker logs minimemcp -f

# Test health
curl http://localhost:8000/health

Common Issues

Ollama Connection

  • Ensure Ollama is running: ollama serve
  • Verify models are downloaded: ollama list
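
If both checks pass, you can also confirm Ollama's HTTP API is reachable on its default port:

# Ollama lists installed models at /api/tags (default port 11434)
curl http://localhost:11434/api/tags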

Memory Processing

  • Check embedding model: ollama pull mxbai-embed-large
  • View logs: docker logs minimemcp | grep embed

📚 Documentation

  • Installation Guide - Detailed setup instructions
  • MCP Configuration - IDE integration guides
  • API Documentation - Available when running

๐Ÿค Contributing

  1. Fork the repository
  2. Create feature branch (git checkout -b feature/amazing-feature)
  3. Commit changes (git commit -m 'Add amazing feature')
  4. Push to branch (git push origin feature/amazing-feature)
  5. Open Pull Request

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

Built with Intelligence-First principles for maximum AI productivity 🚀

Technology Stack

  • Database: PostgreSQL with pgvector for embeddings
  • AI Models: Local Ollama (mxbai-embed-large, deepseek-coder:6.7b)
  • Frontend: React with Material-UI
  • Backend: Node.js with Express
  • Container: Single Docker container with multi-arch support (AMD64 + ARM64)
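
For a feel of the pgvector side, retrieval is essentially an ORDER BY on vector distance. The database, table, and column names below are illustrative only, not MiniMe-MCP's actual schema, and a real query would pass a full-length embedding produced by mxbai-embed-large:

# Hypothetical similarity lookup; "minime", "memories", and "embedding" are assumed names
docker exec -it minimemcp psql -U postgres -d minime -c \
  "SELECT id, content FROM memories ORDER BY embedding <=> '[0.1, -0.2, 0.3]'::vector LIMIT 5;"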
