# iztro OpenAI MCP for Railway
Remote MCP server for Zi Wei Dou Shu / Tu Vi Dau So tools, prepared for Railway and OpenAI ChatGPT developer mode.
## What I found

As of 2026-04-24:

- `@feida/iztro-mcp-server` is not returned by the npm registry.
- The active public package is `@xzkcz/[email protected]`.
- npm publish timeline for `@xzkcz/iztro-mcp-server`:
  - `1.0.3` on 2025-07-26
  - `1.0.4` on 2025-10-03
  - `2.2.1` on 2025-12-03
- The package metadata still points to `https://github.com/xzkcz/iztro-mcp-server.git`, but that GitHub repository returned `404` during verification on 2026-04-24.
- The published package itself is small and lightweight, but its entrypoint starts with `transportType: "stdio"`, so it is not directly usable as a public remote MCP endpoint for OpenAI without a wrapper.
## Why this wrapper exists

- OpenAI remote MCP works over Streamable HTTP or HTTP/SSE.
- The upstream npm package is `stdio`-only.
- This project recreates the useful read-only astrology tools and exposes them at:
  - `GET /`
  - `GET /health`
  - `POST /mcp` (and related Streamable HTTP traffic)
  - `GET /sse`
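As a minimal sketch (NOT this project's actual implementation), here is how a stdio-only tool set can be fronted by an HTTP endpoint that answers `GET /health` and speaks JSON-RPC 2.0 at `POST /mcp`. The tool names come from this repo; the descriptions and wiring are illustrative assumptions:

```typescript
import * as http from "node:http";

type JsonRpcRequest = { jsonrpc: string; id: number; method: string };
type JsonRpcResponse = {
  jsonrpc: string;
  id: number;
  result?: unknown;
  error?: { code: number; message: string };
};

// Tool names from this repo; descriptions are illustrative placeholders.
const TOOLS = [
  { name: "get_astrolabe", description: "Compute a Zi Wei Dou Shu chart" },
  { name: "get_horoscope_years", description: "Yearly horoscope periods" },
];

// Pure JSON-RPC dispatcher, kept separate from the transport so it is easy to test.
export function handleMcpRequest(req: JsonRpcRequest): JsonRpcResponse {
  if (req.method === "tools/list") {
    return { jsonrpc: "2.0", id: req.id, result: { tools: TOOLS } };
  }
  return {
    jsonrpc: "2.0",
    id: req.id,
    error: { code: -32601, message: "Method not found" },
  };
}

// Thin HTTP transport: health probe plus the MCP POST endpoint.
export const server = http.createServer((req, res) => {
  if (req.method === "GET" && req.url === "/health") {
    res.writeHead(200, { "content-type": "application/json" });
    res.end(JSON.stringify({ ok: true }));
  } else if (req.method === "POST" && req.url === "/mcp") {
    let body = "";
    req.on("data", (chunk) => (body += chunk));
    req.on("end", () => {
      res.writeHead(200, { "content-type": "application/json" });
      res.end(JSON.stringify(handleMcpRequest(JSON.parse(body))));
    });
  } else {
    res.writeHead(404);
    res.end();
  }
});
// server.listen(3000); // uncomment to actually serve
```

Keeping the dispatcher pure means the same tool logic could also back an SSE transport without duplication.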
## Included tools

- `get_astrolabe`
- `get_horoscope_decades`
- `get_horoscope_ages`
- `get_horoscope_years`
- `get_horoscope_months`
- `get_mutaged_places`

`gen_astrolabe` was intentionally left out because it writes files to disk and is not useful for a public read-only MCP deployment.
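For orientation, an MCP `tools/call` request for `get_astrolabe` would look roughly like the payload below. The JSON-RPC envelope and `tools/call` method follow the MCP spec; the argument names (`solarDate`, `timeIndex`, `gender`, `language`) are guesses for illustration, not the package's documented schema:

```typescript
// Illustrative MCP tools/call payload; argument names are hypothetical.
const callRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "get_astrolabe",
    arguments: {
      solarDate: "2000-08-16", // hypothetical parameter names below
      timeIndex: 1,
      gender: "male",
      language: "zh-CN",
    },
  },
};
console.log(JSON.stringify(callRequest, null, 2));
```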
## Surface Pro 4 fit check

- Resource-wise, this workload is light enough for a Surface Pro 4 with Core m3, 4 GB RAM, and 128 GB storage.
- Published package sizes checked from npm:
  - `@xzkcz/iztro-mcp-server`: 67,010 bytes unpacked
  - `iztro`: 2,138,358 bytes unpacked
  - `fastmcp`: 1,332,900 bytes unpacked
  - `lunar-typescript`: 1,360,545 bytes unpacked
- Practical issue: to let ChatGPT call it, you need a public HTTPS endpoint. Because this machine did not already have Node installed and local hosting would still need a tunnel or other public ingress, Railway is the cleaner deployment target.
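A back-of-envelope check of the sizes above: the four packages together total under 5 MiB unpacked, comfortably within the machine's 128 GB storage:

```typescript
// Sum of the unpacked sizes reported by npm (figures from this README).
const unpackedBytes: Record<string, number> = {
  "@xzkcz/iztro-mcp-server": 67_010,
  "iztro": 2_138_358,
  "fastmcp": 1_332_900,
  "lunar-typescript": 1_360_545,
};
const total = Object.values(unpackedBytes).reduce((sum, n) => sum + n, 0);
console.log(`${total} bytes ~= ${(total / 1_048_576).toFixed(1)} MiB`);
```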
## Deploy to Railway

- Put this folder in a Git repository.
- Push it to GitHub.
- In Railway, create a new project from that repo.
- Railway should auto-detect Node and run `npm start`.
- After deploy, your MCP endpoint will be `https://YOUR-APP.up.railway.app/mcp`.

Health check: `https://YOUR-APP.up.railway.app/health`

Landing page: `https://YOUR-APP.up.railway.app/`
## Connect from ChatGPT

As verified from OpenAI docs on 2026-04-24:

- ChatGPT Developer mode supports remote MCP over `SSE` and `streaming HTTP`.
- Supported auth modes there are `OAuth`, `No Authentication`, and `Mixed Authentication`.

Recommended setup:

- Open ChatGPT on web.
- Go to `Settings -> Apps -> Advanced settings -> Developer mode`.
- Turn Developer mode on.
- Click `Create app`.
- Use:
  - Server URL: `https://YOUR-APP.up.railway.app/mcp`
  - Authentication: `No Authentication`
## OpenAI Responses API example

```json
{
  "model": "gpt-5.4",
  "input": "Cast a chart for 2000-08-16 at 2 AM, male, locale zh-CN",
  "tools": [
    {
      "type": "mcp",
      "server_label": "iztro",
      "server_url": "https://YOUR-APP.up.railway.app/mcp",
      "require_approval": "never"
    }
  ]
}
```
Because every exposed tool here is read-only, `require_approval: "never"` is a reasonable default.
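If you build that request body in code, a sketch might look like the following. The model name and server URL are this README's placeholders, not verified values, and the type alias `McpTool` is my own shorthand, not an official SDK type:

```typescript
// Hypothetical helper for assembling the Responses API body shown above.
type McpTool = {
  type: "mcp";
  server_label: string;
  server_url: string;
  require_approval: "never" | "always";
};

function buildResponsesBody(serverUrl: string, input: string) {
  const tool: McpTool = {
    type: "mcp",
    server_label: "iztro",
    server_url: serverUrl,
    require_approval: "never", // safe here: every exposed tool is read-only
  };
  return { model: "gpt-5.4", input, tools: [tool] };
}

const body = buildResponsesBody(
  "https://YOUR-APP.up.railway.app/mcp",
  "Cast a chart for 2000-08-16 at 2 AM, male, locale zh-CN",
);
console.log(JSON.stringify(body));
```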
## Run locally later if you want

```shell
npm install
npm start
```

Local endpoints:

- `http://localhost:3000/mcp`
- `http://localhost:3000/sse`
- `http://localhost:3000/health`
If you want ChatGPT on the public Internet to call a local machine, you still need a public HTTPS tunnel such as Cloudflare Tunnel or another ingress layer.
## Windows one-click local hosting

This repo now includes helper scripts for Windows:

```shell
powershell -ExecutionPolicy Bypass -File .\start-local-public.ps1
```

That command:

- starts the MCP server on local port `3000`
- tries a public HTTPS tunnel with Cloudflare Quick Tunnel first
- falls back to `localhost.run` if Cloudflare is unavailable
- prints the public MCP URL you can paste into ChatGPT
When using the local public tunnel, connect ChatGPT to the `/mcp` URL, not `/sse`. This project is intended to be used over Streamable HTTP by OpenAI MCP clients.
To stop everything:

```shell
powershell -ExecutionPolicy Bypass -File .\stop-local-public.ps1
```
The currently active public MCP URL is also written to `.\.runtime\public-url.txt`.
If the startup window does not show the URL clearly, you can recover it with:

```shell
powershell -ExecutionPolicy Bypass -File .\get-public-url.ps1
```
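If you would rather read that file from Node than from PowerShell, a minimal sketch follows; the `.runtime/public-url.txt` path comes from this README, while the helper function itself is my own:

```typescript
import * as fs from "node:fs";
import * as path from "node:path";

// Read the tunnel URL that start-local-public.ps1 records, if any.
export function readPublicUrl(repoRoot: string): string | null {
  const file = path.join(repoRoot, ".runtime", "public-url.txt");
  if (!fs.existsSync(file)) return null;
  const url = fs.readFileSync(file, "utf8").trim();
  return url.length > 0 ? url : null;
}

console.log(readPublicUrl(".") ?? "no tunnel URL recorded yet");
```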
## Create a clean Railway zip

To generate a clean deployment bundle without `node_modules` or runtime logs:

```shell
powershell -ExecutionPolicy Bypass -File .\pack-railway.ps1
```

That creates `..\iztro-openai-mcp-railway-ready.zip`.