MCP Server: Elasticsearch semantic search tool

Demo repo for: https://j.blaszyk.me/tech-blog/mcp-server-elasticsearch-semantic-search/

Table of Contents

  • Overview
  • Running the MCP Server
  • Integrating with Claude Desktop
  • Crawling Search Labs Blog Posts
    • 1. Verify Crawler Setup
    • 2. Configure Elasticsearch
    • 3. Update Index Mapping for Semantic Search
    • 4. Start Crawling
    • 5. Verify Indexed Documents

Overview

This repository provides a Python implementation of an MCP server for semantic search through Search Labs blog posts indexed in Elasticsearch.

It assumes you've crawled the blog posts and stored them in the search-labs-posts index using Elastic Open Crawler.

Running the MCP Server

Add ES_URL and ES_API_KEY to the .env file (take a look here for generating an API key with minimum permissions).
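
For reference, a minimal .env might look like this (the values are placeholders; only the variable names come from the step above):

ES_URL=https://your-elasticsearch-host:9200
ES_API_KEY=your-encoded-api-key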

Start the server in MCP Inspector:

make dev

Once running, access the MCP Inspector at: http://localhost:5173

Integrating with Claude Desktop

To add the MCP server to Claude Desktop:

make install-claude-config

This updates claude_desktop_config.json in your home directory. On the next restart, the Claude app will detect the server and load the declared tool.
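
The generated entry in claude_desktop_config.json looks roughly like the sketch below. The mcpServers structure is Claude Desktop's standard config format, but the server name, command, and arguments shown here are placeholders; the actual values come from the Makefile and your local checkout path:

{
  "mcpServers": {
    "elasticsearch-semantic-search": {
      "command": "uv",
      "args": ["run", "--directory", "/path/to/this/repo", "mcp-server"]
    }
  }
}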

Crawling Search Labs Blog Posts

1. Verify Crawler Setup

To check if the Elastic Open Crawler works, run:

docker run --rm \
  --entrypoint /bin/bash \
  -v "$(pwd)/crawler-config:/app/config" \
  --network host \
  docker.elastic.co/integrations/crawler:latest \
  -c "bin/crawler crawl config/test-crawler.yml"

This should print crawled content from a single page.
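
For orientation, a single-page test config using the console output sink might look roughly like this (a sketch only; the actual crawler-config/test-crawler.yml in this repo will differ):

domains:
  - url: https://www.elastic.co/search-labs
    seed_urls:
      - https://www.elastic.co/search-labs/blog
output_sink: console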

2. Configure Elasticsearch

Set up the Elasticsearch URL and an API key for the crawler.

Generate an API key with minimum crawler permissions:

POST /_security/api_key
{
  "name": "crawler-search-labs",
  "role_descriptors": {
    "crawler-search-labs-role": {
      "cluster": ["monitor"],
      "indices": [
        {
          "names": ["search-labs-posts"],
          "privileges": ["all"]
        }
      ]
    }
  },
  "metadata": {
    "application": "crawler"
  }
}

Copy the encoded value from the response and set it as API_KEY.
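
The encoded key is what the crawler config's Elasticsearch output expects. A rough sketch of that section (host and port are assumptions for a local cluster; the field names follow the Elastic Open Crawler config format):

output_sink: elasticsearch
output_index: search-labs-posts
elasticsearch:
  host: http://localhost
  port: 9200
  api_key: <encoded value from the response>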

3. Update Index Mapping for Semantic Search

Ensure the search-labs-posts index exists. If not, create it:

PUT search-labs-posts

Update the mapping to enable semantic search:

PUT search-labs-posts/_mappings
{
  "properties": {
    "body": {
      "type": "text",
      "copy_to": "semantic_body"
    },
    "semantic_body": {
      "type": "semantic_text",
      "inference_id": ".elser-2-elasticsearch"
    }
  }
}

The body field is copied into semantic_body, a semantic_text field backed by Elasticsearch’s ELSER model, so the page content can be searched semantically.

4. Start Crawling

Run the crawler to populate the index:

docker run --rm \
  --entrypoint /bin/bash \
  -v "$(pwd)/crawler-config:/app/config" \
  --network host \
  docker.elastic.co/integrations/crawler:latest \
  -c "bin/crawler crawl config/elastic-search-labs-crawler.yml"

Tip: If using a fresh Elasticsearch cluster, wait for the ELSER model to start before indexing.
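
One way to confirm the model is ready (an optional check, not part of the original setup) is to inspect the trained model stats and wait until the ELSER deployment reports a started state:

GET _ml/trained_models/_stats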

5. Verify Indexed Documents

Check if the documents were indexed:

GET search-labs-posts/_count

This will return the total document count in the index. You can also verify in Kibana.

Done! You can now perform semantic searches on Search Labs blog posts.
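
As a quick sanity check outside the MCP server, a semantic query against the semantic_body field should return relevant posts (the query text is just an example):

GET search-labs-posts/_search
{
  "query": {
    "semantic": {
      "field": "semantic_body",
      "query": "How does ELSER work?"
    }
  }
}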
