mcp-jinaai-grounding

MCP server for JinaAI grounding


A Model Context Protocol (MCP) server for integrating Jina.ai's Grounding API with LLMs. This server provides efficient and comprehensive web content grounding capabilities, optimized for enhancing LLM responses with factual, real-time web content.

Features

  • ๐ŸŒ Advanced web content grounding through Jina.ai Grounding API
  • ๐Ÿš€ Real-time content verification and fact-checking
  • ๐Ÿ“š Comprehensive web content analysis
  • ๐Ÿ”„ Clean format optimized for LLMs
  • ๐ŸŽฏ Precise content relevance scoring
  • ๐Ÿ—๏ธ Built on the Model Context Protocol

Configuration

This server requires configuration through your MCP client. Here are examples for different environments:

Cline Configuration

Add this to your Cline MCP settings:

{
	"mcpServers": {
		"jinaai-grounding": {
			"command": "npx",
			"args": ["-y", "mcp-jinaai-grounding"],
			"env": {
				"JINAAI_API_KEY": "your-jinaai-api-key"
			}
		}
	}
}

Claude Desktop with WSL Configuration

For WSL environments, add this to your Claude Desktop configuration:

{
	"mcpServers": {
		"jinaai-grounding": {
			"command": "wsl.exe",
			"args": [
				"bash",
				"-c",
				"JINAAI_API_KEY=your-jinaai-api-key npx mcp-jinaai-grounding"
			]
		}
	}
}

Environment Variables

The server requires the following environment variable:

  • JINAAI_API_KEY: Your Jina.ai API key (required)
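
To verify the key outside an MCP client, the server can also be launched directly from a terminal. This is a quick smoke test; the package name and variable come from the configuration examples above:

```shell
# Provide the API key inline and start the server over stdio
JINAAI_API_KEY=your-jinaai-api-key npx -y mcp-jinaai-grounding
```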

API

The server implements MCP tools for grounding LLM responses with web content:

ground_content

Ground LLM responses with real-time web content using Jina.ai Grounding.

Parameters:

  • query (string, required): The text to ground with web content
  • no_cache (boolean, optional): Bypass cache for fresh results. Defaults to false
  • format (string, optional): Response format ("json" or "text"). Defaults to "text"
  • token_budget (number, optional): Maximum number of tokens for this request
  • browser_locale (string, optional): Browser locale for rendering content
  • stream (boolean, optional): Enable stream mode for large pages. Defaults to false
  • gather_links (boolean, optional): Gather all links at the end of the response. Defaults to false
  • gather_images (boolean, optional): Gather all images at the end of the response. Defaults to false
  • image_caption (boolean, optional): Caption images in the content. Defaults to false
  • enable_iframe (boolean, optional): Extract content from iframes. Defaults to false
  • enable_shadow_dom (boolean, optional): Extract content from the shadow DOM. Defaults to false
  • resolve_redirects (boolean, optional): Follow redirect chains to the final URL. Defaults to true
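
As an illustration, a ground_content tool call from an MCP client might carry arguments like the following (the query text and values are hypothetical; the parameter names are those listed above):

```json
{
	"name": "ground_content",
	"arguments": {
		"query": "The James Webb Space Telescope launched in December 2021",
		"format": "json",
		"no_cache": true,
		"token_budget": 4000,
		"gather_links": true
	}
}
```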

Development

Setup

  1. Clone the repository
  2. Install dependencies:
     pnpm install
  3. Build the project:
     pnpm run build
  4. Run in development mode:
     pnpm run dev

Publishing

  1. Update version in package.json
  2. Build the project:
     pnpm run build
  3. Publish to npm:
     pnpm run release

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

MIT License - see the LICENSE file for details.

Acknowledgments
