RocketRide is a high-performance data processing engine built on a C++ core with a Python-extensible node system. With 50+ pipeline nodes, native AI/ML support, and SDKs for TypeScript, Python, and MCP, it lets you process, transform, and analyze data at scale — entirely on your own infrastructure.
Key Capabilities
- Stay in your IDE — Build, debug, test, and scale heavy AI and data workloads with an intuitive visual builder in the environment you're used to. Stop using your browser.
- High-performance C++ engine — Native multithreading. No bottleneck. Purpose-built for throughput, not prototypes.
- Multi-agent workflows — Orchestrate and scale agents with built-in support for CrewAI and LangChain.
- 50+ pipeline nodes — Python-extensible, with 13 LLM providers, 8 vector databases, OCR, NER, PII anonymization, and more.
- TypeScript, Python & MCP SDKs — Integrate pipelines into native applications or expose them as tools for AI assistants.
- One-click deploy — Run on Docker, on-prem, or RocketRide Cloud (👀coming soon). Our architecture is made for production, not demos.
⚡ Quick Start
Install the extension for your IDE:

1. Search for RocketRide in the extension marketplace and install it.
2. Click the RocketRide (🚀) extension in your IDE.
3. Deploy a server — you'll be prompted for how you want to run it. Choose the option that fits your setup:
   - Local (Recommended) — Pulls the server directly into your IDE with no additional setup.
   - On-Premises — Run the server on your own hardware for full control and data residency. Pull the image and deploy to Docker, or clone this repo and build from source.
   - RocketRide Cloud (👀coming soon) — Managed hosting with our proprietary model server. No infrastructure to maintain.
4. Create a `.pipe` file and start building.
🔧 Building your first pipe
All pipelines are recognized by the `*.pipe` format. Each pipeline and its configuration is a JSON object, but the extension in your IDE renders it on our visual builder canvas. All pipelines begin with a source node: webhook, chat, or dropper. For specific usage, examples, and inspiration 💡 on how to build pipelines, check out our guides and documentation.
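To make the JSON-object idea concrete, here is a minimal sketch of building and saving a `.pipe` file from Python. The field names (`nodes`, `type`, `lanes`) and node IDs are illustrative assumptions for this example, not RocketRide's actual schema — see the guides for the real format.

```python
import json

# Hypothetical pipeline: a webhook source feeding an LLM node.
# All field names below are assumptions, not the documented .pipe schema.
pipeline = {
    "name": "hello-pipeline",
    "nodes": [
        {"id": "in", "type": "webhook"},   # every pipeline begins with a source node
        {"id": "llm", "type": "llm"},      # downstream processing node
    ],
    "lanes": [
        {"from": "in", "to": "llm"},       # wire output lane to input lane
    ],
}

# Each pipeline is a JSON object, so a .pipe file is just serialized JSON.
with open("hello.pipe", "w") as f:
    json.dump(pipeline, f, indent=2)
```

Opening the resulting file in the IDE extension is what renders it on the visual builder canvas.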
Connect input lanes and output lanes by type to properly wire your pipeline. Some nodes, like agents or LLMs, can also be invoked as tools by a parent node.
You can run a pipeline from the canvas by pressing the ▶️ button on its source node, or directly from the `Connection Manager`. All available and running pipelines are listed below the `Connection Manager`. Selecting a running pipeline opens in-depth analytics: trace call trees, token usage, memory consumption, and more to optimize your pipelines before scaling and deploying.

📦 Deploy your pipelines to RocketRide Cloud or run them on your own infrastructure.
Docker — Download the RocketRide server image and create a container. Requires Docker to be installed.

```shell
docker pull ghcr.io/rocketride-org/rocketride-engine:latest
docker create --name rocketride-engine -p 5565:5565 ghcr.io/rocketride-org/rocketride-engine:latest
```

RocketRide Cloud (👀coming soon) — Managed hosting with our proprietary model server and batched processing. The cheapest option to run AI workflows and pipelines at scale (seriously).
Run your pipelines as standalone processes, or integrate them into your existing Python and TypeScript/JS applications using our SDKs.
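As a sketch of the standalone path: a deployed pipeline with a webhook source can be triggered over plain HTTP, with no SDK required. The `/pipelines/<name>` path below is a hypothetical endpoint layout invented for this example (only the 5565 port comes from the Docker command above); check the SDK documentation for the supported client APIs.

```python
import json
import urllib.request

def build_invoke_request(host: str, pipeline: str, payload: dict) -> urllib.request.Request:
    """Build an HTTP request that would trigger a webhook-source pipeline.

    The URL layout (/pipelines/<name>) is an assumption for illustration,
    not RocketRide's documented endpoint.
    """
    url = f"http://{host}:5565/pipelines/{pipeline}"
    body = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_invoke_request("localhost", "hello-pipeline", {"text": "hi"})
# urllib.request.urlopen(req)  # send it once the Docker container is running
```

The language-specific SDKs wrap this kind of plumbing, plus auth and streaming, behind native Python and TypeScript APIs.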
Use it, commit it, ship it. 🚚
Useful Links
- 📚 Documentation
- 💬 Discord
- 🤝 Contributions
- 🔒 Security
- ⚖️ License
Made with ❤️ in 🌁 SF & 🇪🇺 EU